Alexa: What Can Apollo 11 Teach Us About Our Relationship with Technology?

 

Samsung’s Apollo 11-themed television ad for its next-generation 8K TVs is inspired: Families of contrasting backgrounds huddle around the tube in dark-paneled living rooms of the 1960s, eyes glistening with wonder, as they watch Neil Armstrong step onto the lunar surface. As commercials go, it’s a canny ode to American greatness past, and a stroke of advertising genius. It also reminds us that nostalgia makes for a foggy lens.

 

Yes, Apollo 11 was a big deal. Historian Arthur Schlesinger, Jr. rated space exploration as “the one thing for which this century will be remembered 500 years from now … ” and saw the July 1969 moon landing as the key event. Its success sealed America’s standing as the planet’s unrivaled leader in science and technology, and today’s media lookbacks, including major TV documentaries, make that case. The better of these reports highlight the turbulent times in which Apollo took flight, as American cities boiled with protests for racial and economic justice and against the Vietnam War, and concerns about the environment were on the rise. Yet, it’s still easy to gloss over the fact that, for most of the 1960s, most Americans opposed Washington spending billions of dollars on space. 

 

What also gets overlooked is Apollo’s importance as a pivot in our national thinking about science and technology. By the late 1960s, young Americans, in particular, had come to see our rivalries with the Soviet Union—“space race” and “nuclear arms race”—as stand-ins for an ur-struggle between humankind and its machines. Baby boomers, like me, loved their incredible shrinking transistor radios, out-of-this-world four-track car stereos, and Tang, the breakfast drink of the astronauts. But we’d also seen Stanley Kubrick’s “2001: A Space Odyssey” (1968), and knew HAL the computer was not to be trusted. Harvard behavioral psychologist B.F. Skinner wryly fingered the irony of the age: “The real question is not whether machines think but whether men do.” Given the times, a healthy skepticism was in order.

 

In today’s digital age, we have renewed cause for pause given the way our machines have snuggled into our daily lives, a Siri here, an Alexa, smartwatch or home-security system there. The new intimacy raises a question that animated the early days of the Space Age: How do we harness technology’s Promethean powers before they harness us?

 

C.P. Snow joined that debate in 1959, when the British physicist and novelist argued that a split between two “polar” groups, “literary intellectuals” and “physical scientists,” was crippling the West’s response to the Cold War. In “The Two Cultures,” a landmark lecture at Cambridge University, Snow said “a gulf of mutual incomprehension” separated the sides, “sometimes [involving] … hostility and dislike, but most of all lack of understanding.” Scientists in the U.K. and throughout the West had “the future in their bones,” while traditionalists were “wishing the future did not exist.” A cheerleader for science, Snow nonetheless warned that the parties had better heal their breach or risk getting steamrolled by Russia’s putative juggernaut.

 

Without “traditional culture,” Snow argued, the scientifically minded lack “imaginative understanding.” Meanwhile, traditionalists, “the majority of the cleverest people,” had “about as much insight into [modern physics] as their Neolithic ancestors would have had.” Snow’s point: Only an intellectually integrated culture can work at full capacity to mesh solutions to big problems with its fundamental human values.

 

On this side of the pond, President Eisenhower wondered where the alliance among science, industry and government was propelling America. In his 1961 farewell address, the former five-star general warned of a military-industrial complex that risked driving the United States toward an undemocratic technocracy. By that time, of course, the Russians had already blunted Ike’s message thanks to their record of alarming firsts, including Sputnik I, the world’s first Earth-orbiting satellite, in October 1957, and Sputnik II, carrying the dog Laika, the next month. The nation’s confidence rattled, Eisenhower had ramped up the space program and launched NASA in 1958.

 

Talk of a technology gap, including a deeply scary “missile gap,” with Russia gave the Soviets more credit than they deserved, as it turned out, but the specter of a foe raining nuclear warheads on us was impossible for political leaders to ignore. Meanwhile, the boomer generation got enlisted in a cultural mobilization. Under Eisenhower, public school students learned to “duck and cover.” When John Kennedy replaced him in 1961, our teachers prepared us to confront the Soviet menace by having us run foot races on the playground or hurl softballs for distance; in the classroom, they exhorted us to buckle down on our math and science lest the enemy, which schooled its kids six days a week, clean our clocks.

 

In April 1961, the Soviets sprang another surprise—successfully putting the first human, cosmonaut Yuri Gagarin, into low-Earth orbit. President Kennedy countered on May 25, telling a joint session of Congress that “if we are to win the battle that is now going on around the world between freedom and tyranny …” one of the country’s goals should be “landing a man on the moon and returning him safely to the earth” by the end of the 1960s. It was a bold move, requiring a prodigious skillset we didn’t have and would have to invent.

 

The fact that we pulled off Apollo 11 at all is a testament to American ingenuity and pluck. Yet while the successful moon landing decided the race for space in America’s favor, it didn’t undo our subliminal angst about the tightening embrace of technology.

 

The mechanized carnage of World War II had seen to that. The war had killed tens of millions of people worldwide, including over 400,000 Americans, and the atomic bombs dropped on Hiroshima and Nagasaki opened humankind to a future that might dwarf such numbers. In a controversial 1957 essay, author Norman Mailer captured the sum of all fears: In modern warfare we could well “… be doomed to die as a cipher in some vast statistical operation in which our teeth would be counted, and our hair would be saved, but our death itself would be unknown, unhonored, and unremarked . . . a death by deus ex machina in a gas chamber or a radioactive city. …”

 

As the United States and Soviet Russia kept up their decades-long nuclear stalemate, the American mind wrestled with a sublime paradox: Only modern technology, the source of our largest fears, could protect and pacify us in the face of the dangers of modern technology. 

 

Today, we grapple with a variation on that theme. Fears of nuclear annihilation have given way to concerns less obtrusively lethal but potentially devastating: cyber-meddling in our elections, out-and-out cyberwarfare, and nagging questions about what our digital devices, social media, and today’s information tsunami may be doing to our brains and social habits. In 2008, as the advent of the smartphone accelerated the digital age, technology writer Nicholas Carr wondered about the extent to which our digital distractions had eroded our capacity to store the accreted knowledge, what some call crystallized intelligence, that supports civilized society. The headline of Carr’s article in The Atlantic put the point bluntly: “Is Google Making Us Stupid?”

 

Similarly, MIT social psychologist Sherry Turkle has argued that too much digital technology robs us of our fundamental human talent for face-to-face conversation, reduces the solitude we need for the contemplation that adds quality to what we have to say, and contributes to a hive-mindedness that can curtail true independence of thought and action.

 

That’s a head-twisting departure from the American tradition of the empowered individual–an idea that once inspired our intrepid moonwalkers. In his 1841 essay “Self-Reliance,” Ralph Waldo Emerson advised America to stand on its own two feet and eschew Europe as a source for ideas and intellectual custom; rather, we should establish our own culture with the individual as the sole judge of meaning, and get on with creating a new kind of nation, unshackled by the past. “There is a time in every man’s education,” wrote Emerson, “when he arrives at the conviction that envy is ignorance; that imitation is suicide; that he must take himself for better, for worse, as his portion. …”

 

A half-century later, in a globalizing and technologically more complex world, philosopher William James applied the Goldilocks principle to citizen-philosophers. For Americans, he argued, European-inspired “rationalism” (being guided by high-minded principle) was too airy, “empiricism” (just the facts, please) was too hard—but “pragmatism” (a mix of principles and what really works, with each individual in charge of deriving meaning) was just right. James sought to meld “the scientific loyalty to the facts” and “the old confidence in human values and the resultant spontaneity, whether of the religious or of the romantic type.”

 

Maybe this is what James had in mind when he reached for a description of America’s democratic inner life: “For the philosophy which is so important in each of us is not a technical matter; it is our more or less dumb sense of what life honestly and deeply means. It is only partly got from books; it is our individual way of just seeing and feeling the total push and pressure of the cosmos.”

 

Today, the gap between our technological capabilities and our human means of coping with them is only likely to widen. As Michiko Kakutani pointed out in her 2018 book “The Death of Truth”: “Advances in virtual reality and machine-learning systems will soon result in fabricated images and videos so convincing that they may be difficult to distinguish from the real thing … between the imitation and the real, the fake and the true.”

 

(If you’ve been keeping up with developments in “deepfake” technology, you know that a scary part of the future is already at the disposal of hackers foreign and domestic.)

 

In a sense, today’s digital dilemma is the reverse of what C.P. Snow talked about 60 years ago. Our technology innovators still have the future in their bones, to be sure; but the health of our society may now rest less on making the world a more convivial place for science per se than on deploying our humanistic traditions to make our fast-moving technology serve and sustain the human enterprise.

 

At a broader level, of course, Snow was right: “To say we have to educate ourselves or perish, is a little more melodramatic than the facts warrant,” he said. “To say we have to educate ourselves or watch a steep decline in our own lifetime, is about right.” And we can’t truly do that without prioritizing a more comprehensive partnership between the science that pushes technology ahead and the humanities that help us consider the wisdom of such advances in light of the best that has been thought and said.

 

And there, the Apollo program serves as a helpful metaphor. In 1968, as Apollo 8 orbited the moon in a warm-up run for Apollo 11, astronaut Bill Anders snapped “Earthrise,” the iconic photograph showing our serene blue-white sphere hanging in lonely space. Often credited with helping to launch modern environmentalism, the image underscored what human beings have riding on the preservation of their home planet. The turn in our thinking was reflected 16 months later when Earth Day was born. And ironically, perhaps, the Apollo program spun off technology—advanced computing and micro-circuitry—that helped ignite today’s disorienting digital explosion, but also produced applications for environmental science and engineering, for example, that promote the public good.

 

Meanwhile, our shallow grasp of digital technology presents problems that are deeper than we like to think, when we think about them at all. As it stands, most Americans, this one included, have only the shakiest handle on how digital technology works its influence on us, and even the experts are of mixed minds about its protean ways.

That order of technological gap is what worried Aldous Huxley, author of the classic dystopian novel “Brave New World.” As Huxley told Mike Wallace in a 1958 interview, “advancing technology” has a way of taking human beings by surprise. “This has happened again and again in history,” he said. “Technology … changes social conditions and suddenly people have found themselves in a situation which they didn’t foresee and doing all sorts of things they didn’t really want to do.” Unscrupulous leaders have used technology and the propaganda it makes possible in subverting the “rational side of man and appealing to his subconscious and his deeper emotions … and so, making him actually love his slavery.”

It may not be as bad as all that with us—not yet, anyway. But it does point back to our central question: Have our digital devices gotten the drop on us—or can we train ourselves to use them to our best advantage? This summer’s Apollo 11 anniversary is as timely an occasion as any to remind us of what’s at stake in the race to decide who, or what, controls our personal and collective inner space.

HBO’s Chernobyl and the Rendering of History

 

While watching HBO’s recent 5-part dramatization of the 1986 Soviet nuclear accident at Chernobyl, which spewed more radioactive material into the atmosphere than the Hiroshima and Nagasaki bombings combined, I kept thinking of all the suffering it caused. (Because of the difficulties of determining eventual early deaths due to radiation exposure, we don’t know whether they number in the thousands, tens of thousands, or more.)

 

I also kept thinking of lines from Ian McEwan’s novel Black Dogs (1992):

He was struck by the recently concluded war [World War II] not as a historical, geopolitical fact but as a multiplicity, a near-infinity of private sorrows, as a boundless grief minutely subdivided without diminishment among individuals who covered the continent like dust. . . . For the first time he sensed the scale of the catastrophe in terms of feeling; all those unique and solitary deaths, all that consequent sorrow, unique and solitary too, which had no place in conferences, headlines, history, and which had quietly retired to houses, kitchens, unshared beds, and anguished memories.

Like wars, the Chernobyl accident had all kinds of unforeseen consequences.

 

In an earlier HNN essay, I mentioned that novels, films, or television can sometimes stir our emotions and imaginations more than drier works by professional historians. And truth comes to us not just through our intellects, but also from the affective areas of our personalities. That same essay dealt with the problem of determining truth in fictionalized history. Regarding Chernobyl, Masha Gessen, who is both a U.S. and Russian citizen, provides some guidance.

 

She lauds the “uncanny precision with which the physical surroundings of Soviet people have been reproduced.” One example that struck me was a dilapidated sign hanging over a street that read “Our goal is the happiness of all mankind” (also the title of Episode 4 of the miniseries). Such signs were abundant in Soviet Russia. One banner hanging over a street (a photo of which I included in my A History of Russia) urged children returning to school after summer vacation in 1978 to “get ready to become active fighters for the cause of Lenin and for communism.”

 

Although praising background depictions, Gessen faults the miniseries for “its failure to accurately portray Soviet relationships of power.” Too often, it unrealistically depicts “heroic scientists,” especially the fictional Ulyana Khomyuk (Emily Watson), “confronting intransigent bureaucrats by explicitly criticizing the Soviet system of decision-making.”

 

Despite such failures, the Chernobyl episodes do a good job depicting the effects of the tragedy. The suicide of scientist Valery Legasov. The suffering of the young Lyudmilla Ignatenko as she watches the slow and painful death of her fireman husband, Vasily, and later has her newborn daughter die of the radiation absorbed during the pregnancy. The hundreds of miners exposing themselves to Chernobyl radiation—at the end of the series we are informed that “it is estimated that at least 100 of them died before the age of 40.” (All quotes from the miniseries are taken from the episode scripts.) The young soldier Pavel forced to kill contaminated dogs and other animals. The old woman who refuses to move out of her contaminated home even after a soldier shoots the cow she is milking—some 300,000 people “were displaced from their homes.” And we think, “How could Soviet leaders have been so careless as to allow such a tragedy to occur?”

 

The causes, as usually happens with historical events, were many, and the series mentions some of them. The fifth (and last) episode of the miniseries is devoted mainly to the 1987 trial of three Chernobyl officials whom Soviet authorities claimed were most responsible. The miniseries certainly indicated that they shared some of the blame, but the fault was also endemic to the Soviet system.

 

In episode 5 a fictional KGB head tells scientist Legasov that the Chernobyl accident was essentially “the result of operator error.” The KGB and judge at the trial wanted to deflect any suggestion that the Soviet communist system itself was at fault. The judge tells Legasov, “If you mean to suggest the Soviet State is somehow responsible for what happened, then I must warn you—you are treading on dangerous ground.”   

 

Gessen notes that Harvard historian Serhii Plokhy’s 2018 book on Chernobyl argues that “it was the Soviet system that created Chernobyl and made the explosion inevitable.” (See an excerpt of the book here.) In fairness to the miniseries, it does indicate some of that blame.

 

In one scene featuring the three men who were put on trial, one of them tells the other two that the power at Chernobyl could not be lowered to the extent it should have been for the safety test, the failure of which caused the massive nuclear accident. Why couldn’t it be lowered more? “It's the end of the month. All the productivity quotas? Everyone's working overtime, the factories need power.”

 

As one book on the Soviet environment states, “For the environment, the central planning system became Frankenstein’s monster. . . . The plan and its fulfillment became engines of destruction geared to consume, not to conserve, the natural wealth and human strength of the Soviet Union.” 

 

Fulfilling quotas, whether multi-year, yearly, or monthly ones, generally became more important than safety or quality considerations. (See here for prioritizing the production schedule over nuclear safety at Chernobyl.) In the vast Soviet bureaucratic central-planning system, pleasing your superiors by meeting quotas became an important path for advancement.

 

That same system discouraged individual initiative, initiative that might have prevented or mitigated the effects of the accident. One individual reporting in May 1986 to the Central Committee of the Communist party about conditions his investigating group discovered at Chernobyl wrote that “we constantly heard the following phrase: ‘We did not receive those instructions from the center.’” He added, “They waited for orders from Moscow.” (In annual summer trips to the USSR in the mid- and late 1980s, I frequently observed this reluctance to exercise initiative. In the summer of 1986, for example, just months after Chernobyl, the group I was leading was assigned inadequate lodging when we checked into a hotel in Odessa, a city more distant from Chernobyl than Kiev, where we were originally scheduled to go. When I complained to the hotel manager that the accommodations assigned to us were inferior to those we had arranged and paid for, he informed us that he would have to straighten the matter out with officials in Moscow. It took three hours to do so. Only then were we assigned proper lodging.)

 

At the trial mentioned above, Legasov indicates still other failings of the Soviet system, especially its secrecy and lies. “They are,” he says, “practically what defines us. When the truth offends, we lie and lie until we cannot even remember it's there.” He also mentions that, to save money, various safety measures, like having containment buildings built around the reactors, were not taken.

 

In his Memoirs, published after the disintegration of the Soviet Union, which he headed from 1985 to 1991, Mikhail Gorbachev alluded to the Soviet failings indicated above. He wrote:

The accident at the Chernobyl nuclear power plant was graphic evidence . . . of the failure of the old system. . . . 

The closed nature and secrecy of the nuclear power industry, which was burdened by bureaucracy . . . had an extremely bad effect. I spoke of this at a meeting of the [Communist party] Politburo on 3 July 1986: ‘For thirty years you scientists, specialists, and ministers have been telling us everything was safe. . . . But now we have ended up with a fiasco.  . . . Throughout the entire system there has reigned a spirit of servility, fawning, clannishness and persecution of independent thinkers. . . .

Chernobyl shed light on many of the sicknesses of our system as a whole. Everything that had built up over the years converged in this drama: the concealing or hushing up of accidents and other bad news, irresponsibility and carelessness, slipshod work, wholesale drunkenness. This was one more convincing argument in favor of radical reforms. 

 

Gorbachev himself is depicted in the miniseries as someone more interested in discovering the truths of Chernobyl than in covering them up. And in general that was true, but he inherited a Soviet system that was not much interested in truth or justice, and he had to contend with many government and Communist party officials opposed to some of the radical reforms he pushed, such as more openness, less censorship, and economic restructuring.

 

Media and the courts were strictly controlled by the Communist party and government. About the Chernobyl trial, Legasov says, “It's a show trial. The ‘jury’ has already been given their verdict.” One indication that such was the usual practice was the observation of an earlier Soviet dissident that among 424 known political trials in the decade following 1968, there were no acquittals in any of them.

 

But the value of HBO’s Chernobyl is not just what it tells us about the failings of the Soviet system. It offers much more. In 2011, the Bulletin of the Atomic Scientists published an essay by Gorbachev that listed some of the lessons that the world could learn from the accident. One was that we “must invest in alternative and more sustainable sources of energy” like wind and solar. Surely, another lesson is the need for caution in developing any powerful technology. In June 2019, conservative New York Times columnist Bret Stephens wrote an op-ed entitled “What ‘Chernobyl’ Teaches About Trump.” In it he compared the effects of Trump’s many lies to those of Communist officials. Wikipedia offers us a convenient overview of the miniseries, including a summary of each episode, and most significantly links to various other essays that comment on the miniseries and its relevance for today. 

Say It Ain't So Joe: Strategies of Segregation in Ventura County

A protest of desegregation busing in Boston, 1974

 

 

From the last Democratic debate, we learned that in the 1970s Joe Biden opposed federally mandated busing to desegregate schools because he believed it was a matter to be reckoned with by local government.

To be fair to Joe, most people—black, brown, and white—at the time liked their neighborhood schools. 

 

White-collar professionals purchased homes significantly based on the public school that came with them—historically better resourced than those in black and brown communities systematically concentrated in the nation’s inner cities. Less affluent minority parents, too, simply desired equally funded neighborhood schools with effective teachers friendly to the needs of their children. And for many others, a culturally relevant curriculum that instilled an amour propre in students from diverse backgrounds was a plus.

 

Largely absent from today’s public conversation on mandated busing is the racism that created segregated neighborhoods in the first place and translated into poorly funded schools for minority children. As Eric Avila in Popular Culture in the Age of White Flight: Fear and Fantasy in Suburban Los Angeles (2004) and Richard Rothstein in The Color of Law: A Forgotten History of How Our Government Segregated America (2017) detail, in the 1930s, Federal Housing Administration policy, via the Home Owners’ Loan Corporation, created a redlining system that encouraged real estate interests (lenders, developers, and agents) to concentrate people of color away from white homeowners. This was the history behind the desegregation case of Soria v. Oxnard School District Board of Trustees (1971) in Ventura County, California.

 

As David G. García details in Strategies of Segregation: Race, Residence, and the Struggle for Educational Equality (2018), beginning in the 1930s the Oxnard School District accommodated white homeowners who did not want their children socializing primarily with Mexican children, the largest non-white demographic. As limited funding and facilities made complete segregation impossible, OSD administrators, at the direction of trustees, gerrymandered attendance boundaries and schedules to separate students as much as possible. To maintain this system over the next two decades, the OSD constructed two segregated Mexican schools in the 1940s, less than one block from each other. When these sites overcrowded, the district imported portable classrooms and constructed new campuses nearby.

 

Ten years after Brown v. Board of Education (1954), the Community Service Organization, an ethnic Mexican civil rights group, and the National Association for the Advancement of Colored People of Ventura County protested the segregationist practices of the OSD trustees. The district contended that de facto school segregation was an outcome of residential patterns outside its purview. As the City of Oxnard grew, the CSO and NAACP persistently petitioned the OSD to remedy racial imbalances in the schools. The board rejected all of the numerous desegregation plans proposed by Althea Simmons, field secretary of the Los Angeles chapter of the NAACP, and by its own advisory committee.

 

Fed up with the intransigence of OSD trustees, black and ethnic Mexican parents filed the Soria case in federal court in 1970. In May 1971, Judge Harry Pregerson’s summary judgment found both de facto segregation and “de jure overtones,” including but not limited to the creation of new schools, individual intra-district transfers via busing, and the use of portables to keep black and brown students concentrated in segregated schools.

 

These were constitutional violations of equal protection under the 14th Amendment. As a result, Judge Pregerson mandated a paired-schools busing plan as a remedy. That September, buses transported children of the barrio to their paired schools in the city’s more middle-class neighborhoods and vice versa. Like Kamala Harris in Berkeley at this time, as a first grader I, too, was bused from an ethnically integrated neighborhood of black, brown, and Asian American families in south Oxnard to Brittell Elementary in the predominantly white, northern part of the city.

 

In November of 1973, the U.S. Ninth Circuit Court of Appeals vacated Judge Pregerson’s summary judgment and remanded the case for a trial. Subsequently, board minutes from the 1930s surfaced that evidenced the de jure segregation of Mexican children to appease white parents. Former OSD superintendents, including Los Angeles County Superintendent of Schools Dr. Richard Clowes, also testified that up to and throughout the 1960s, trustees maintained segregation. Based on prior and fresh findings, Judge Pregerson ruled in favor of the plaintiffs, and busing continued.

 

The need to bus students faded through the 1980s as the City of Oxnard increasingly browned. Its cause: a middle-class flight of diverse races and ethnicities to the neighboring communities of Camarillo and Ventura. But as the demographics of these communities shifted over time, flight renewed. People moved further eastward, if able, to Newbury Park and Thousand Oaks.

 

Hence, a more insidious segregation exists today as people of all colors and creeds troll education and real estate websites for school rankings. The systemic outcome: the segregation of largely black and brown students, again. If a school’s status dips and the number of brown students rises, some parents, if they can, will move to whiter, more affluent neighborhoods or commute their children to higher-performing and less racially diverse schools. Without the segregationist mentors whom Joe Biden proudly worked with as a U.S. senator in the 1970s, this is the new face of de facto school segregation.

Delightful but Dizzying Romp as the 17th Century Meets the 1950s  

Viola is a lovely young blonde-haired woman washed up on a sandy shore after a shipwreck that took the life of her brother Sebastian. Now alone in the world, and seeking a mate, she disguises herself as a man and goes to work for the wealthy and powerful (and very handsome) Duke Orsino in Illyria. She falls in love with him. The Duke, though, is smitten with Lady Olivia and chases her fervently while Viola chases him. So, you have a man, the Duke, chasing a lady, Olivia, and a man chasing the Duke. Or is it a woman chasing the Duke? Or is anybody really chasing the Duke? Is everybody chasing him?

 

There is merriment galore in this new production of William Shakespeare’s Twelfth Night, which just opened at the Shakespeare & Co. theater in Lenox, Massachusetts, in the Berkshires. There is a bit of dizziness, too, as you try to figure out who is who. There are other characters who are trying to help or hinder the romances of this smart trio. They contest against each other and conspire with each other throughout the tale. You need a scorecard.

 

This production of Twelfth Night has an unusual setting – a seaside nightclub in the year 1959 to represent, I guess, fun and frolic just before the turbulent 1960s. As you find your seat you are happily bombarded with rock and roll music from 1959, tunes such as Venus, by Frankie Avalon.

 

All of this makes for an enjoyable night at the theater, but you get lost trying to follow the plot and trying to figure out the motives of all the very odd characters in the story and a lot of hidden wrinkles that should remain hidden – very hidden. 

 

Twelfth Night has been staged in many different ways. It is often set on a ship, for example, and the sea surrounds the actors. It has been set in different centuries and folks travel by cars, horses and carriages. Here it is the Bard rocking and rolling to the beat of Pink Shoelaces, by Dodie Stevens, and Rock Around the Clock, by Bill Haley and the Comets.

 

Our heroine, Viola, is really two people, a man and a woman. She becomes the sexual object of desire, as a man, for Lady Olivia. Now Olivia is being pursued ardently by a very surprising mystery man. For Viola, wrestling with Olivia throughout the play as a guy and pining for Orsino as a woman (you’ve got to pay attention here), the question is – what to do?

 

She bumps into a group of men who tell jokes, make sarcastic remarks and sing a lot. They sing original songs written for the play and they, and the audience, listen to a long list of 1950s songs that sometimes have something to do with the play and are a part of this 1601 version of Dick Clark’s American Bandstand television show.

 

The pace of the story is faster than the Indianapolis 500 auto race and you have to keep up with three or four subplots at the same time and try, try, try, to find Duke Orsino, who wanders around 1959 like a man in search of an Esso gas station. All of the characters tease and torture poor old Malvolio, one of Shakespeare’s great comic characters, who is wonderful and involved in the hidden subplot.

 

Although the plot is a bit mixed up, there are many good things to say about this Twelfth Night. Director Allyn Burrows has done a very admirable job of mastering the story and, although it is unwieldy, he keeps the tale moving along and milks every bit of comedy out of it, in addition to directing a fine cast of actors. There are wonderful small scenes involving small characters, such as the mirthful Sir Andrew and his pals Sir Toby and Feste, who does some fine singing throughout the play. They add some spark to the story.

 

The director gets really superb work from the cast. Ella Loudon is triumphant as the woman/man Viola. She is on stage for practically the entire play and does yeoman work. Miles Anderson is just a vision as scampy Malvolio, who has the audience chuckling for the length of the play. Other good performances are turned in by Martin Jason Asprey as the sea captain who rescues Viola from the shipwreck, Bruce Michael Wood as heartthrob Orsino, Steven Barkhimer as Sir Toby, Gregory Boover as Feste, Nigel Gore (marvelous) as Sir Andrew, and Deacon Griffin Pressley as the mysterious suitor of Lady Olivia. When Olivia is not shouting too much, she is a welcome addition to the play, portrayed nicely by Cloteale L. Horne. Bella Merlin plays Olivia’s servant and possesses the world’s loudest and longest cackle.

 

If you see this Twelfth Night, bring a scorecard to keep track of the characters and the plot and be prepared for a long two-and-a-half-hour play. Oh, and bring your dancing shoes. You may not be able to follow Viola, but you certainly can follow the rock and roll songs in the story.

 

PRODUCTION: The play is produced by Shakespeare & Co. Sets: Christina Todesco, Costumes: Govane Lohbauer, Sound and original music: Arshan Gailus, Lighting: Deb Sullivan, Fight Director: Allyn Burrows. The play is directed by Burrows. It runs through August 4.

Will We all Survive by the Skin of Our Teeth?

 

In 1942, just after the United States entered World War II, playwright Thornton Wilder, who had written the classic Our Town just five years earlier, wrote The Skin of Our Teeth. It was very unusual then and pretty unusual now. The play is the story of a family of mammals that has evolved into people over 5,000 years. They are the Antrobuses: a mom and dad, who do not get along, a daughter, and a son who is a rebel. The dad, George, is the president of the mammal society and has been since the days of Noah. The family has dealt with tragedy for all those centuries and now it is smacked with World War II.

 

The play, which just opened at the Fitzpatrick Theater, Berkshire Theatre Group, in Stockbridge, Massachusetts, begins with the family living in the Ice Age and surrounded by large mammoths and other ancient animals. Act one tries to show that the mammals were more civilized than people. It also sets up the idea that this is a unique play, a bit weird, maybe, and, in the end, not just the story of humankind, particularly Americans, but of what will happen to humankind if it does not change its ways (as if we are EVER going to change our ways).

 

It is a strange look at the world, very surrealistic, and something you would expect to be written today, not in 1942. In it, playwright Wilder shows an amazing vision of the future. Much of it happened, if vaguely, just the way he predicted.

 

Despite its odd structure and its tromping, stomping mammoths, which wander about half off the stage, their roars frightening everybody, The Skin of Our Teeth is a colossal success: a perceptive look at our past and our future, from post-World War II life to the present day and beyond.

 

In the beginning of the story, someone reminds the audience that Americans got through the Depression “by the skin of our teeth” and that we can get through anything, at that time referring to the war. You could jump into 1942’s future, though, and look at Watergate, Vietnam, the Civil Rights movement and even today’s border crisis to see how Americans did get through everything, and in much the way Wilder predicted. Oh, we’ve had our bumps and bruises, but we have made it so far. Wilder then adds that if we have gotten this far, we can get farther. He’s right.

Following the sci-fi start of the show in act one, the tale shifts to contemporary 1942 at America’s most glamorous resort, Atlantic City (which has fallen into gambling disrepair over the last forty years). George and his wife are the leaders of the mammal society but have their domestic problems, and their two troublesome teenagers (teenagers were troublesome 5,000 years ago and remain so today, and will be a pain 5,000 years from now). Middle-aged George, married so long, thinks about having a fling with Miss Atlantic City, his kids rebel and his wife seems fed up with everything and everybody. They tell people that when the big machine on stage turns red the world will be coming to an end. It does, and the world seems to be rushing that way, given the war, but it survives. In act three, in a dazzling performance by Marcus Gladney Jr. as the teenage son Henry, we see the future after World War II and it is not rosy.

 

The Skin of Our Teeth is not an easy play to stage or watch. At the start of act three, for no reason, the action stops and the stage manager tells us, as part of the play, that actors have become ill and new actors have been put on stage. There is a rehearsal for the new actors. The play then reverts back to form with the newcomers. There are trips to 5,000 years ago, conventions, beaches, strobe lights, explosions, boardwalks and lots of cantankerous people.

 

Director David Auburn has done a splendid job of keeping all of this running smoothly in his skilled hands. He also remembers the past warmly but meets the future with open arms and an eager smile, as do the actors. Auburn gets wonderful performances from the entire cast. Particularly good performances are turned in by Danny Johnson as George, Harriet Harris as his wife, Ariana Venturi as daughter Sabina, Marcus Gladney Jr. as son Henry, and (delightful) Lauren Baez as the Antrobuses’ maid. They are surrounded by an ensemble of fine actors.

 

At the end of the play, George Antrobus looks out over the audience and says that the end of the play, earth’s resolution, had not been written in 1942. It has not been written today, either, and will not be until another 5,000 or 10,000 years have gone by. People struggle on, sometimes triumphantly and sometimes tragically, but we do survive.

 

What will the world be like 10,000 years from now?

 

Will there still be robot calls?

 

Will there still be 546 candidates in each televised Presidential debate? 

 

Will parents still tell everybody that their kids are the smartest, most athletic and most beautiful people that ever lived?

 

PRODUCTION: The play is produced by the Berkshire Theatre Group. Sets: Bill Clarke, Costumes: Hunter Kaczorowski, Sound: Scott Killian, Lighting: Daniel J. Kotlowitz. The play is directed by David Auburn. It runs through August 3.

Mary Jo Kopechne’s Legacy

 

 

On July 19, 1969, Senator Edward M. Kennedy drove his black Oldsmobile off a bridge on Chappaquiddick Island, near Martha’s Vineyard, Massachusetts. His 28-year-old passenger, Mary Jo Kopechne, was killed in the accident.                                    

 

A week later, Kennedy went on national television to ask the people of Massachusetts for their forgiveness. And they forgave him each of the six times he was reelected to the Senate until his death from brain cancer in 2009. History has not been as kind to Kopechne.

 

For fifty years, Mary Jo was treated as collateral damage by the Kennedys and the Washington political establishment. The media spun the tragedy as part of a much larger “curse” on the Kennedy family, one that prevented the Massachusetts senator from being elected to the presidency. Other, more sensationalist writers suggested that Kopechne was an opportunist who was having an affair with the senator.

 

But Kopechne’s life and legacy are much greater than her death at Chappaquiddick and the cottage industry of scandalous accounts that followed over the next half century.  Mary Jo was a bright young woman who was a pioneer for a later generation of female political consultants, including Mary Matalin, Ann Erben and Donna Lucas.                                                                                                    

 

Inspired by President John F. Kennedy's challenge to "ask what you can do for your country," Mary Jo Kopechne became part of the civil rights movement, taking a job as a school teacher at the Mission of St. Jude in Montgomery, Ala. Three years later, she joined the Capitol Hill staff of New York Sen. Robert F. Kennedy.                                                                                       

 

A devout Catholic, Kopechne lived in a Georgetown neighborhood with three other women. She rarely drank, didn’t smoke and was offended by profanity, yet she was irresistibly drawn to the fast-paced, glitzy world of Washington.                                

 

Mary Jo distinguished herself on RFK’s staff by working long hours at his Washington headquarters. During Bobby’s 1968 presidential campaign, Kopechne served as a secretary to speechwriters Jeff Greenfield, Peter Edelman and Adam Walinsky and tracked and compiled data on how Democratic delegates from various states might vote. She shared the latter responsibility with five other young women: Rosemary Keough, Esther Newberg, Nance and Maryellen Lyons, and Susan Tannenbaum. Collectively, they were known as the “Boiler Room Girls,” after the windowless office they worked in at 2020 L Street in Washington, DC.

 

At age 27, Mary Jo was the oldest of the Boiler Room Girls and the one who had worked for RFK the longest. She was the key Washington contact in the Boiler Room.  She also kept track of delegates in Indiana, Kentucky and Pennsylvania, critical battleground states where polls were predicting a close race between Kennedy and Vice-President Hubert Humphrey.                                                    

 

Mary Jo was only paid $6,000 a year when she was hired, compared to the male legislative assistants who started at a salary of between $12,000 and $15,000 a year.  In the four years she worked for RFK she never earned more than $7,500 a year; enough to pay rent and maintain a Volkswagen Beetle.                                 

 

By today’s standards, Kopechne was grossly underpaid, extremely overworked and dismissed as a “secretary” when her responsibilities demanded the more respectable title of “political consultant” and pay to match. But she belonged to a transitional generation of women who paved the way for the feminists of the 1970s and their fight for gender equality.

 

Of all the Boiler Room Girls, Kopechne was “the most politically astute,” according to Dun Gifford, who supervised the operation. “Mary Jo had an exceptional ability to stay ahead of fluctuating intelligence on delegates. That ability allowed her to negotiate deals on RFK’s behalf, to travel with him when necessary and even to offer her opinions when she had the best working knowledge of a situation.

                                                                                                   

“Had Bobby won the election, Mary Jo would have been rewarded with a very significant job in his administration,” added Gifford. Kopechne was devastated by RFK's assassination in June 1968, and she felt she could no longer work on Capitol Hill. Instead, she took a job with a political consulting firm in Washington, D.C.                                                                             

 

On the evening of July 18, 1969, Mary Jo attended a party thrown by Ted Kennedy on Chappaquiddick to honor her and the other Boiler Room Girls. Later that night, she accepted the senator's offer to drive her back to her hotel on Martha’s Vineyard. Kennedy's car swerved off a narrow, unlit bridge and overturned in the water. The senator escaped from the submerged car, but Kopechne died after what Kennedy claimed were "several diving attempts to free her."                                                   

 

By the time Kennedy reported the accident to police the following morning, Kopechne's body had been recovered. John Farrar, the diver who found her, reported that she had positioned herself near a backseat wheel well, where an air pocket had formed, and had apparently suffocated rather than drowned. Farrar added that he "could have saved her life if the accident had been reported earlier." Kennedy could easily have been charged with involuntary manslaughter and sentenced to significant jail time for Kopechne's death. Judge James Boyle instead sentenced him to two months' incarceration, the statutory minimum for the crime. He then suspended the sentence, saying that Kennedy had "already been, and will continue to be, punished far beyond anything this court can impose."

 

Perhaps Ted Kennedy tried to do his penance in the United States Senate, where he championed historic legislation on civil rights, immigration, education, and health care.                                                                                                                  

 

If so, Mary Jo Kopechne inspired those achievements because her death forced the embattled senator to strive for a higher standard.

 

Billionaires and American Politics

 

Is the United States becoming a plutocracy?

With the manifestly unqualified but immensely rich Donald Trump serving as the nation’s first billionaire president, it’s not hard to draw that conclusion.  And there are numerous other signs, as well, that great wealth has become a central factor in American politics.

Although big money has always played an important role in U.S. political campaigns, its influence has been growing over the past decade.  According to the Center for Responsive Politics, by 2014 the share of political donations by the wealthiest 0.01 percent of Americans had increased to 29 percent (from 21 percent four years before), while the top 100 individual donors accounted for 39 percent of the nation’s super PAC contributions.  

With the 2016 presidential primaries looming, would-be Republican nominees flocked to Las Vegas to court billionaire casino magnate Sheldon Adelson and his wife, who had donated well over $100 million to Republican groups during the 2012 election cycle.  Although even Adelson’s money couldn’t save them from succumbing to vicious attacks by Trump, Adelson quickly forged a close alliance with the billionaire president. In 2018, he became the top political moneyman in the nation, supplying Republicans with a record $113 million.

In fact, with Adelson and other billionaires bringing U.S. campaign spending to $5.2 billion in that year’s midterm elections, the big-ticket players grew increasingly dominant in American politics.  “We like to think of our democracy as being one person, one vote,” noted a top official at the Brennan Center for Justice.  “But just being rich and being able to write million-dollar checks gets you influence over elected officials that’s far greater than the average person.”

This influence has been facilitated, in recent years, by the rise of enormous fortunes. According to Forbes ― a publication that pays adoring attention to people of great wealth―by March 2019 the United States had a record 607 billionaires, including 14 of the 20 wealthiest people in the world.  In the fall of 2017, the Institute for Policy Studies estimated that the three richest among them (Jeff Bezos, Bill Gates, and Warren Buffett) possessed more wealth ($248.5 billion) than half the American population combined.  

After this dramatic example of economic inequality surfaced in June 2019, during the second Democratic debate, the fact-checkers at the New York Times reported that the wealth gap “has likely increased.” That certainly appears to be the case. According to Forbes, these three individuals now possess $350.5 billion in wealth―a $102 billion (41 percent) increase in less than two years.

The same pattern characterizes the wealth of families.  As Chuck Collins of the Institute for Policy Studies recently revealed, Charles and David Koch of Koch Industries (their fossil fuel empire), the Mars candy family, and the Waltons of Walmart now possess a combined fortune of $348.7 billion―an increase in their wealth, since 1982, of nearly 6,000 percent.  During the same period, the median household wealth in the United States declined by 3 percent.

Not surprisingly, when billionaires have deployed their vast new wealth in American politics, it has usually been to serve their own interests.

Many, indeed, have been nakedly self-interested, sparing no expense to transform the Republican Party into a consistent servant of the wealthy and to turn the nation sharply rightward.  The Koch brothers and their affluent network poured hundreds of millions (and perhaps billions) of dollars into organizations and election campaigns promoting tax cuts for the rich, deregulation of corporations, climate change denial, the scrapping of Medicare and Social Security, and the undercutting of labor unions, while assailing proposals for accessible healthcare and other social services.  And they have had substantial success.  

Similarly, billionaire hedge fund manager Robert Mercer and his daughter, Rebekah, spent $49 million on rightwing political ventures in 2016, including funding Steve Bannon, Breitbart News, and Cambridge Analytica (the data firm that improperly harvested data on Facebook users to help Trump’s campaign).  After Trump’s victory, Robert stayed carefully out of sight, sailing the world on his luxurious, high-tech super yacht or hidden on his Long Island estate.  But Rebekah worked on the Trump transition team and formed an outside group, Making America Great, to mobilize public support for the new president’s policies.

The story of the Walton family, the nation’s wealthiest, is more complex.  For years, while it fiercely opposed union organizing drives and wage raises for its poorly-paid workers, it routinely channeled most of its millions of dollars in campaign contributions to Republicans.  In the 2016 elections, it took a more balanced approach, but that might have occurred because Hillary Clinton, a former Walmart director and defender of that company’s monopolistic and labor practices, was the Democratic standard-bearer.

Although some billionaires do contribute to Democrats, they gravitate toward the “moderate” types rather than toward those with a more progressive agenda.  In January 2019, an article in Politico reported that a panic had broken out on Wall Street over the possibility that the 2020 Democratic presidential nomination might go to someone on the party’s left wing.  “It can’t be Warren and it can’t be Sanders,” insisted the CEO of a giant bank.  More recently, billionaire hedge fund manager Leon Cooperman made the same point, publicly assailing the two Democrats for their calls to raise taxes on the wealthy. “Taxes are high enough,” he declared. “We have the best economy in the world. Capitalism works.”

The political preferences of the super-wealthy were also apparent in early 2019, when Howard Schultz, the multibillionaire former CEO of Starbucks, declared that, if the Democrats nominated a progressive candidate, he would consider a third-party race.  After Schultz denounced Warren’s tax plan as “ridiculous,” Warren responded that “what’s ‘ridiculous’ is billionaires who think they can buy the presidency to keep the system rigged for themselves.”

Can they buy it? The 2020 election might give us an answer to that question.

5 Times Presidents Lost Big in the Midterms But Won (or nearly won) Reelection

 

By any standard of measure, the 2018 midterm congressional elections created a blue wave that swept major regions of the country. With a record voter turnout for midterm elections, eight million more people cast ballots for Democrats than Republicans. That translated into a Democratic majority in the House of Representatives with a gain of 40 seats. 

 

Attention now turns to 2020 and its presidential election. What happens in the next presidential election when a first-term president’s party has been rejected by the American people? In the post-World War II era, presidents have repeatedly won reelection after their party suffered significant losses in the midterms. Analyzing these instances may give us a clue as to whether a Trump second term is in our future.

 

Democrat Harry Truman was the first post-war president whose party suffered a major defeat in the midterms. Assuming the presidency less than three months after becoming vice-president, Truman faced a myriad of domestic problems stemming from the transition from World War II to a peace-time economy. Inflation and labor unrest, including a nationwide railroad strike, consumed the new presidency. The American people went to the polls in the 1946 midterms and resoundingly punished the party in power. The Democrats lost their majorities in both the Senate and House of Representatives, dropping 10 and 54 seats respectively. Believing the election demonstrated that the public wanted a Republican president, Arkansas Senator J. William Fulbright called on Truman to appoint a Republican secretary of state, then next in the line of succession, and resign, handing the presidency to the opposition.

 

Truman did not take that advice and instead rose from the political ashes to win in 1948 by reinforcing the government programs and policies that had proved popular during FDR’s New Deal. In September 1945, Truman proposed his 21 Point Program, including an extension and increase of the minimum wage, expanded public works, and stronger unemployment compensation, housing subsidies, and farm price supports and subsidies. Truman later proposed universal healthcare. Although almost none of these proposals were law in 1948, Truman nonetheless made social welfare the cornerstone of his platform. In his acceptance speech at the Democratic Convention of 1948, Truman said, “Republicans approve of the American farmer, but they are willing to help him go broke. They stand four-square for the American home—but not for housing. They are strong for labor—but they are stronger for restricting labor's rights. They favor minimum wage—the smaller the minimum wage the better.” The promise of expanded government programs was popular with various segments of American society and Truman barnstormed the country and won a presidential term.

 

Twenty-eight years later, Republicans experienced heavy losses in the 1974 midterm elections. Only three months earlier, Richard Nixon had resigned in disgrace, leaving the White House to his appointed vice-president, Gerald Ford. When the American people went to the polls in November 1974, inflation was in the double digits and—even worse—the seemingly honest, untainted President Ford had pardoned his predecessor one month to the day after Nixon announced his resignation. The results were devastating: the Republicans lost 48 seats in the House and 3 in the Senate.

 

Like pessimistic Democrats in 1946, some Republicans viewed the 1974 results as a warning that Gerald Ford should not be the party’s standard bearer in 1976. But Ford had a reputation for honesty and conservative principles, and he fought the Democratic-controlled Congress, vetoing 60 bills. Although unemployment was fairly high, inflation had dropped into single digits by 1976, and Ford successfully repelled Ronald Reagan’s strong bid for the presidential nomination. After falling far behind his Democratic opponent Jimmy Carter, Ford nearly caught up and won 48 percent of the popular vote. A shift of a few thousand votes in Ohio and Mississippi would have given Ford a majority in the Electoral College.

 

Ronald Reagan also won reelection despite Republican losses in the preceding midterms. During the 1982 elections, the nation was in the worst recession since the Great Depression. With inflation declining but unemployment nearing 11 percent, the Republicans lost 26 House seats but broke even in the Senate, maintaining a slim majority there. Over the next two years, unemployment declined, inflation remained under control, and Reagan’s massive tax cut seemed to be working. Reagan, remaining as likeable as ever to most Americans, easily won reelection in 1984.

 

More strikingly, the Democrats lost 54 seats in the House and 10 in the Senate in the 1994 midterms during President Bill Clinton’s first term. Republicans won a majority in the House for the first time in 40 years, and in the Senate for the first time in eight years. Anti-Democratic groups such as right-to-lifers, term-limit supporters, and the NRA got out the vote, but the Democrats did not. Republicans began calling Clinton irrelevant and predicted his easy defeat in 1996. 

 

A government shutdown likely aided Clinton’s reelection bid. In 1995, the Republican House of Representatives under Speaker Newt Gingrich proposed a federal budget that would limit what the government could do for the environment, education, and many other areas. Worst of all for Republicans, the budget would have increased Medicare premiums. When Clinton vetoed the budget, the government shut down in 1995-96. By the time Congress reached an agreement, most Americans blamed the Republicans. Clinton ran for reelection by asking the voters, “Do you want the same people who would endanger your Medicare to also have the White House?” Furthermore, Clinton appeared mainstream and moderate by signing welfare reform and the Defense of Marriage Act, expanding his base.

 

In the 2010 midterm elections, during Barack Obama’s first term, the Democrats lost 63 seats and their majority in the House. Although Democrats narrowly maintained their majority in the Senate, they lost six seats. The Republicans mobilized voters by claiming that Obama’s $831 billion stimulus package had done nothing to end the recession and by criticizing “Obamacare,” the newly passed health care legislation. 

 

By November 2012, unemployment had dropped two points, and at least a million more Americans had obtained health insurance under Obamacare. Furthermore, the Republicans and their presidential nominee Mitt Romney played to a very conservative base without expanding it. The coalition of ethnic and racial minorities that helped elect Obama in 2008 held together and succeeded again.

 

The post-war history of midterm elections contains lessons for Trump and his campaign advisors. Presidents Truman, Reagan, Clinton, and Obama all saw their parties defeated badly in midterms during their first terms, yet two years later the voters rewarded them with a second presidential term. President Ford came close to doing the same. 

 

Given this history, what are Trump’s 2020 prospects? Truman succeeded by fighting against a “do-nothing” Congress and for government programs popular with many Americans. But Trump’s border wall has gained no traction with the American people, and many will likely blame him for the ongoing government shutdown. Reagan’s likeability, coupled with a rebounding economy, saved his presidency. Trump, however, remains strongly disliked by two-thirds of the American people. Clinton moved toward the center, attracting the moderate voters who comprise a large segment of the general electorate. Trump shows no sign of taking moderate or broader positions on any issue, from immigration to healthcare. So while presidents have previously overcome midterm election losses, Trump isn’t following any of the established paths to rebound and win reelection. 

The Civil War Battle Decided By A General's Mistakes

Downtown Lynchburg circa 1919

 

Lynchburg, Virginia, today displays many markers of its Civil War history. Several signs in and around the city indicate where Confederate forces were placed in defense of the city. There is a statue of a Confederate infantryman at the top of Monument Terrace. In Riverside Park, what is left of the hull of the Marshall, the packet boat that carried the body of Gen. Stonewall Jackson through Lynchburg, sits in an enclosed display near the James River. At the junction where Fort Avenue splits, just beyond Vermont Avenue, and Memorial Avenue begins, there is a monument to Jubal Early. Nearby sit Fort Early, just to the west of that juncture, and Fort McCausland, on what is today Langhorne Road; each served as part of the outer defenses of Lynchburg, and each is named for a Confederate commander considered a hero of the battle. Finally, Samuel Hutter’s Sandusky residence, the headquarters of the Union’s leaders during the campaign, sits undisturbed down Sandusky Drive to the north of Fort Avenue.

 

Why does Lynchburg display so many markers, ultimately in the cause of a losing effort?

 

Lynchburg was one of the most important Southern cities in the conduct of the war and the only major Virginia city not captured by the North. By analyzing how the city avoided capture, we gain an additional perspective on the military history of the Civil War and see, in this instance, the large consequences of following one’s inclinations instead of one’s orders.

 

Both North and South deemed Lynchburg a pivotal city in the conduct of the Civil War. It was centrally located in Virginia, where much of the fighting took place. With its three railroads and the Kanawha Canal, it was an extraordinary transportation hub: Confederate troops gathered there to be sent by rail elsewhere, supplies moved through it, and its numerous warehouses made it a place to bring wounded soldiers from across the South with hopes of convalescence. It was also near Richmond, the capital of the Confederacy. Yet with a wall of mountains to its northwest and the James River to the east, it was defensible.

 

In spite of its significance, the Civil War did not come to Lynchburg until the summer of 1864, when Union Gen. David Hunter was tasked with capturing the city. Hunter had a history of conducting military affairs as he saw fit, not as he was ordered. For instance, in 1862 he issued, without proper authority, an order freeing all slaves in Georgia, Florida, and South Carolina. President Lincoln quickly countermanded that order. Hunter also, without authorization, began enlisting black soldiers from South Carolina to form the 1st South Carolina. Lincoln again rescinded that order.

 

In June 1864, Hunter and his men approached Lynchburg after leaving the Shenandoah Valley. The Confederate convalescents from the hospitals, under the command of the invalid Gen. John C. Breckinridge, erected breastworks around the city in an effort to defend it and its citizens.

 

Hunter was under orders to destroy the railroad and the canal at Lynchburg and generally to follow a scorched-earth policy vis-à-vis all industries that might be used to benefit the South on his way through Staunton to Lynchburg. Union Gen. Ulysses Grant wrote to Hunter: “The complete destruction of [the railroad] and of the canal on the James River are of great importance to us. You [are] to proceed to Lynchburg and commence there. It would be of great value to us to get possession of Lynchburg for a single day.” In a single day, Lynchburg’s infrastructure could be annihilated, thereby crippling the South’s capacity to transport goods, soldiers, and the ill and wounded.

 

Yet again Hunter did as he pleased, not as he was commanded. As he moved southwest from Staunton, he tarried so that he could burn or destroy almost everything in his path to Lynchburg. He was sidetracked by several raids in Lexington, where he remained from June 11 to June 14. He burned down the Virginia Military Institute and plundered Washington College—he even took a statue of Washington as part of his booty—and had plans to raze even more as he traveled, for instance the University of Virginia. These raids were likely his undoing in the battle. Darrell Laurant writes in “The Battle of Lynchburg”: “The invaders were thwarted for a number of reasons, but chief among them, there was the failure of commanding general Hunter to cut this vital rail line north of the city when he had the opportunity.”

 

Before the arrival of Confederate troops, Lynchburg was protected only by some 700 convalescent soldiers, under the active command of the lame Gen. Francis Nichols. Thus, General Robert E. Lee ordered Gen. Jubal Early to assist the invalids Breckinridge and Nichols in defending Lynchburg. Breckinridge had Gen. D.H. Hill set up breastworks around the city. They were also aided by Gen. John McCausland, who arrived in Lynchburg ahead of Hunter, and by John Imboden, who had a small remnant of cavalry; the two had established a defensive position to the southwest of Lynchburg, at a breastwork near the Quaker Meeting House, near the Salem Turnpike.

 

Early and the Second Corps arrived in Lynchburg early in the afternoon on Friday, June 17. With the railroad tracks damaged in several places, the transit from Charlottesville to Lynchburg took five hours. Until Early arrived, McCausland and Imboden had kept Hunter’s troops, over 10 times their number, in check, but they were slowly being driven back. Even with Early’s troops, the Confederates were still at a distinct numerical disadvantage—some 8,000 to 10,000 Confederate soldiers to some 16,000 to 18,000 Union soldiers—so Early ran trains all night along the tracks on June 17 in an effort to convince the Union troops that still more Confederates were arriving. Hunter wrote in his diary, “During the night the trains on the different railroads were heard running without intermission, while repeated cheers and the beating of drums indicated the arrival of large bodies of troops in the town.”

 

Early’s ruse—pretending that there were more Confederate soldiers than there were—worked. Hunter was convinced that he faced superior numbers. Ammunition, he wrote in his diary, was also running short. After discussing the military situation with his colleagues at Sandusky, Hunter ordered an immediate withdrawal of Union troops on the night of June 18. Hunter later wrote to Gen. Grant of his decision: “It had now become sufficiently evident that the enemy concentrated a force at least double the numerical strength of mine, and what added to the gravity of the situation was the fact that my troops had scarcely enough of ammunition left to sustain another well-contested battle.”

 

The following morning, Early pursued the retreating Union soldiers with some success, soon catching the rear guard of the blue-coats and killing a number of them. Yet the Union soldiers did escape to Salem and eventually to the mountains of what is now West Virginia. Grant, however, had requested that Hunter, if in retreat, move toward Washington, where his troops could be of use in defense of the city. Hunter instead chose a safer route because, he said, of his dearth of ammunition.

 

Confederate Capt. Charles Blackford, who left behind a lengthy account of the battle published in 1901, challenged Hunter’s claims of inferior numbers and scarce ammunition. Hunter knew that his numbers were superior—“he had scouts on both railroads and the country was filled with the vigilant spies who prided themselves on their cleverness”—and he was not low on ammunition: “It cannot be believed that a corps was short of ammunition which had been organized but a few weeks, a part only of which had been engaged at Piedmont, and which had fought no serious pitched battle, and the sheep, chickens, hogs and cattle they wantonly shot on their march could not have exhausted their supply. The corps would not have started had the ammunition been so scarce.” The Union, Blackford maintained, was well stocked with ammunition. He concluded that Hunter, more interested in campaigns where there was little chance of loss of life, was a coward.

 

There is substance to Blackford’s assertions. Hunter could readily have arrived in Lynchburg by June 16, when he would have faced only the convalescent guard, the Silver Grays, and the few other men available to Breckinridge. Had he done so, Lynchburg would have fallen. Yet he tarried in Lexington to pillage and burn needlessly. He was also slowed by the constant burning and plundering of houses along the way, which caused pain and loss to Southerners uninvolved in the fight and in no way advanced the North’s cause.

 

Thus, one of the most significant cities for the fate of the South, Lynchburg, was not captured. Had Hunter arrived before Early and had he destroyed the railroad tracks and Kanawha Canal as he was ordered to do, the city would have fallen and the Civil War would likely have ended in 1864.

 

And so the real savior of Lynchburg was neither Early nor McCausland, but Hunter, who, scholars agree, had ambitions much larger than his abilities. Perhaps Hill City ought to erect a monument in a prominent place in his honor.

What Does It Mean to Be Patriotic?

Steve Hochstadt is a professor of history emeritus at Illinois College, who blogs for HNN and LAProgressive, and writes about Jewish refugees in Shanghai.

 

 

July 4 means that healthy and unhealthy discussions of patriotism again take center stage. Will you wave the flag?

 

I always welcome thoughtful conversations about what behavior is patriotic, about how we should act if we love our country. It’s too bad that this rarely happens. Face-to-face talks about patriotism usually begin as arguments and accusations, and then get worse. Among people of like minds, how to be patriotic is also seldom sincerely and frankly addressed. Maybe we are all afraid to discover that we don’t agree, or that our ideas can be easily criticized.

 

For example, the premise that all good Americans should love our country is a starting point that is never questioned. The postwar conservative refrain that liberals did not love America and wanted to betray it to the world communist movement has never abated; it has only taken different forms in different political eras. When I was growing up, it was crudely expressed as a taunt to antiwar protesters: “Love it or leave it.” Of course, no self-respecting conservative would now dare suggest that cozying up to post-Soviet Russia is unpatriotic, considering Trump’s attempts to excuse Putin’s electoral meddling. That has taken some of the sting out of the taunt that Democrats are socialists, but not so much that Republicans don’t use it every day.

 

Just a few years ago, Republicans howled that if a black man like Rev. Jeremiah Wright said, “God damn America, for treating our citizens as less than human,” and if our black President had ever listened to him, then the whole election of Obama was tainted by lack of patriotism.

 

Must a German Jew love her country? Could she not be a loyal citizen, but still experience other feelings besides love for Germany, even 70 years after the end of Nazi rule? Must a Russian whose grandparents were murdered in Stalin’s purges by the secret police now love a country run by the former KGB leader?

 

Must African-Americans who experienced discrimination on their own bodies now simply love America, when segregation and discrimination still exist, and when our President is an unrepentant racist? That’s just the beginning of a thoughtful confrontation with the meaning of patriotism.

 

A second problem with discussions of patriotism is that they are often about symbols rather than behavior. In fact, conservatives and liberals agree about many political behaviors that should characterize a patriotic American: voting, paying taxes, and serving on juries. But conservatives tend to value reverence for symbols of America much more than liberals do. In a survey last year, 71% of Republicans, but only 34% of Democrats, said that knowing the Pledge of Allegiance was important for good citizenship. Displaying the flag was important for 50% of Republicans, but only 25% of Democrats.

 

Someone posted on Facebook the false claim that none of the 10 Democratic presidential candidates at the first debate wore flag pins, and then concluded that “Democrats hate Americans and America.” That is a familiar refrain from the right wing.

 

Another difference between partisans is how criticisms of one’s country are regarded. While half of Democrats think a good citizen should protest when the government does something wrong, that is true for only a third of Republicans. Conservatives have argued my entire lifetime that criticisms of America and American history are equivalent to treason. That was the position conservatives defended when protests came mainly from liberals during the 1960s and 1970s. Now that much protest comes from the right, about “over-regulation” or investigations of Trump, conservative protest has become legitimate. For them it’s fine that candidate and then President Trump displayed patriotism while offering wide-ranging criticisms of America: our President was illegitimate; our airports were “third-world”; our FBI committed treason; our military leaders are ignorant. Trump became the epitome of conservative patriotism, not out of any principles about what patriotism means, but from pure partisanship.

 

Some Republican “principles” are defended only when convenient. Seventy-nine percent of Republicans said that good citizens “always follow the law,” compared to 61% of Democrats, but Trump’s multiple legal transgressions are ignored or defended.

 

Whatever the thinking behind the idea of patriotism, Republicans believe theirs is the right way. A survey one year ago showed that 72% of Republicans rated themselves “very patriotic”, while only 29% of Democrats chose this label for themselves. Since Trump’s election, the self-proclaimed patriotism of Democrats has dropped significantly.

 

It turns out that patriotism refers both to long-term feelings about country and more temporary feelings about current political leadership. Behavior and symbols are both important, but to different people. Political differences lead too often to claims that the other side is not just wrong, but also unpatriotic.

 

Because patriotism is about feelings, it is hard to analyze, even for oneself. The American women just won the soccer World Cup. I rooted for them all the way, just the way I root for American athletes I’ve never heard of in the Olympic Games or at Wimbledon. I don’t think that makes me a better American, just a normal one.

The 2020 Election Presents a Unique Opportunity to Elect a “New Generation of Leadership”

 

The 2020 election presents a unique opportunity to elect a “new generation of leadership” to the presidency. The American public has done so before, as represented by John F. Kennedy in 1960; Jimmy Carter in 1976; Bill Clinton in 1992; and Barack Obama in 2008.

 

One way to elect a “new generation of leadership” is by electing a younger president. Such would be the case with Pete Buttigieg, who would be 39 at the time of inauguration; Tulsi Gabbard, 39; Seth Moulton, 42; Julian Castro, 46; Tim Ryan, 47; Beto O’Rourke, 48; Cory Booker, 51; Steve Bullock, 54; Kirsten Gillibrand, 54; Kamala Harris, 56; Michael Bennet, 56; John Delaney, 58; Bill de Blasio, 59; or Amy Klobuchar, 60.

 

When one examines modern American political history, one discovers that the Democratic Party has traditionally nominated much younger presidential candidates than the Republicans.

 

The average age of all Presidents is about 55, but since 1952, with two exceptions, all of the Democratic presidential nominees have been younger than 60 years old. As exceptions, John Kerry was 61 when he ran for President in 2004 and Hillary Clinton was 69 in 2016. In chronological order, the Democratic nominees were: Adlai Stevenson, age 52 and 56; John F. Kennedy, 43; Lyndon B. Johnson, full term, 56; Hubert Humphrey, 57; George McGovern, 50; Jimmy Carter, 52 and 56; Walter Mondale, 56; Michael Dukakis, 56; Bill Clinton, 46 and 50; Al Gore, 52; Barack Obama, 47 and 51. 

 

The Republican nominees have generally been older: Dwight D. Eisenhower, age 62 and 66; Gerald Ford, 63 when running for full term; Ronald Reagan, 69 and 73; George H. W. Bush, 64 and 68; Bob Dole, 73; John McCain, 72; Mitt Romney, 65; Donald Trump, 70. The only exceptions were Richard Nixon, 47, 55 and 59; Barry Goldwater, 55; and George W. Bush, age 54 and 58.

 

So if the Democrats nominate Bernie Sanders, 79 at the time of inauguration; Joe Biden, 78; Elizabeth Warren, 71; Jay Inslee, 69; or John Hickenlooper, 68; they would alter a historical pattern. 

 

In the past, there has often been a wide age gap between the two presidential candidates, as with Gerald Ford and Jimmy Carter in 1976 (11 years); Ronald Reagan and Jimmy Carter in 1980 (13 years); Ronald Reagan and Walter Mondale in 1984 (17 years); George H. W. Bush and Bill Clinton in 1992 (22 years); Bob Dole and Bill Clinton in 1996 (23 years); John McCain and Barack Obama in 2008 (25 years); and Mitt Romney and Barack Obama in 2012 (14 years).

 

Now in 2020, we could have a much wider divergence in age—as much as 36 years between Donald Trump and Pete Buttigieg.

 

2020 could be a “revolutionary” and unique election year beyond the issue of age. We could possibly elect the first woman President (Warren, Harris, Klobuchar, Gillibrand, Gabbard); our first mixed-race woman President (Harris); our second African American male President (Booker); our first Latino President (Castro); our first gay President (Buttigieg); our first Jewish President (Sanders, Bennet); our first Hindu President (Gabbard), born in the US territory of American Samoa; our oldest first-term President at inauguration (Sanders, Biden, Warren); our first President to reach 80 years of age in office (Sanders, Biden); our first sitting-mayor President (Buttigieg, de Blasio); our first sitting-congressman President since James A. Garfield in 1880 (Gabbard, Moulton, Ryan); or a President younger than Theodore Roosevelt or John F. Kennedy (Buttigieg, Gabbard, Moulton).

 

Why is this important for the upcoming election? The answer is that “fresh blood,” whether of age, gender, ethnicity, or sexual orientation, represents the long-term future of America as the nation becomes more diverse than it has ever been. Promoting change and uniqueness in political leadership could result in higher voter turnout and would potentially enhance efforts to address the challenges of the 21st century. Historically, this occurred in the early to mid-20th century with the era of Theodore Roosevelt, Woodrow Wilson, Franklin D. Roosevelt, and Harry Truman.

The future is ultimately in the hands of those born since 1980, who will lead America in the next few decades. Despite the current strength of leaders born during World War II and the early Cold War years, the long-range future suggests the “torch should pass to a new generation of leadership,” as California Congressman Eric Swalwell stated in the first debate in late June, quoting John F. Kennedy’s Inaugural Address (although Swalwell dropped out of the race on July 9). The same situation occurred when Jimmy Carter, Bill Clinton, and Barack Obama took the oath of office, and it is likely that the same will occur in 2020.

 

Most certainly, the Presidential Election of 2020 will be one of the most fascinating and significant elections in American history.

Roundup Top 10!  

Only Washington Can Solve the Nation’s Housing Crisis

by Lizabeth Cohen

The federal government once promised to provide homes for every American. What happened?

 

Democrats’ Ominous Shift on School Segregation

by Brett Gadsden

It’s not just Joe Biden—the party has backed away from its commitment to fighting segregation in the public schools.

 

 

How antitrust laws can save Silicon Valley — without breaking up the tech giants

by Margaret O'Mara

For AT&T in the 1950s, antitrust enforcement helped increase competition while keeping Ma Bell intact.

 

 

How Fake News Could Lead to Real War

by Daniel Benjamin and Steven Simon

We think of false information as a domestic problem. It’s much more dangerous than that.

 

 

The War Against Endless War Heats Up With Koch-Soros Salvo

by Ronald Radosh

The otherwise ideologically opposed billionaires are the latest unlikely pair to find common ground in the idea that American power is the root cause of the world’s problems.

 

 

The Riptide of American Militarism

by William Astore

As Americans wrestled with the possibility of finding themselves in a second looming world war, what advice did the CFR have for then-President Franklin Delano Roosevelt in 1940?

 

 

The white nostalgia fueling the ‘Little Mermaid’ backlash

by Brooke Newman

The uproar over a black Ariel shows how important representation in children’s entertainment is.

 

 

There’s More to Castro Than Meets the Eye

by Jonathan M. Hansen

The revolutionary leader fought for and defended the very democratic ideals his government would later suspend.

 

 

Roosevelt versus the refugees: One FDR policy that Bernie Sanders never mentions

by Rafael Medoff

Sanders favors a much more liberal U.S. immigration policy. Not Roosevelt. In fact, FDR’s immigration policy was so strict that if Sanders’s father, Eli, had not arrived from Poland before Roosevelt became president, he probably would not have been admitted.

 

 

Why We Need More Black Women In Economics

by Keri Leigh Merritt

Recently a group of brilliant, driven, young Black women formed The Sadie Collective, an organization that “seeks to be an answer to the dismal representation of Black women in the quantitatively demanding fields such as public policy, economics, data analytics, and finance.”


 

What to an American Is the Fourth of July?

by Ibram X. Kendi

Power comes before freedom, not the other way around.

DJ Alan Freed: The Inventor of Rock and Roll

 

The “jukebox musical” is a term for plays about an individual or group in rock music history that offer a lot of songs but a thin plot and undeveloped characters. Most of them fail. Some, like Jersey Boys, Beautiful, and Ain’t Too Proud, succeed.

 

Rock and Roll Man: The Alan Freed Story, the tale of the fabled 1950s DJ who coined the term “rock and roll” and was the most famous DJ in America until the arrival of Dick Clark, falls somewhere in the middle. The play, with book by Gary Kupper, Larry Marshak, and Rose Caiola, and music and lyrics by Kupper, just opened at the Colonial Theatre, part of the Berkshire Theater Festival, in Pittsfield, Massachusetts. It is a rocking and bopping night of wop-bop-a-doo-wop rock music entertainment, loaded with fabulous songs from the early days of rock, such as those by Chuck Berry, Little Richard, the Coasters, and the Platters, and full of brilliant choreography.

 

The play has eye-opening staging. There is a turntable on stage that spins about in the first act, a high courthouse bench for the play’s judge, a second-level balcony on which performers delight the audience, and recreated music studios, night clubs, and bars. It is a music city of a stage. It is, for rock fans, a rip-roaring good night at the theater.

 

The play has a flashback format. At its beginning, Alan Freed emerges as a troubled drunk as the payola scandal (bribes paid to DJs to play particular songs) hits America. Then we go back to a mythical court drama in which Freed is on trial (in the court of public opinion). His defense attorney is the colorful, flamboyant singer Little Richard, played with all of his pomposity and wildness by Richard Crandle, whose costume looks like an exploding candle. The prosecutor is FBI Director J. Edgar Hoover. The court then recounts Freed’s life, from his early days as a DJ at a small radio station in Cleveland to his position as the number one DJ in America at New York City’s WINS radio in the late 1950s.

 

Director Randal Myler does a pretty good job of holding on to the reins of a play that is a bit cumbersome. He gets fine performances from Crandle as Little Richard and from a very talented ensemble of actors, singers, and dancers, plus numerous famous quartets.

 

The play has some problems that hurt it, though. The opening act is overloaded with songs and underdeveloped in plot. It appears to be more of a musical revue – the best moments of the 1950s, everybody please clap. You are overwhelmed with song after song. Dizziness sets in. The tunes are good, but they knock you over in your seat. The story of Freed, who was such a towering music figure in that era, emerges very slowly, and Alan Campbell, the actor who plays Freed, never quite gets going in his role. Campbell rambles throughout the play, presenting Freed as more of a bystander in the story than the tenacious man himself. He needs a sharper focus and more pizzazz. George Wendt, the co-star of the famed television series Cheers, has the same problem in playing J. Edgar Hoover: he never captures the character and is a bit miscast in the role. The play is also too long. It runs about two and a half hours, and a good twenty minutes could be cut, especially in the first act. Many songs overlap one another, make the same plot point, and could be dropped.

 

The story needs to be sharper. There are points in it, such as the African American singer Frankie Lymon kissing a white girl on TV, that were very controversial, and they are glossed over in the story. Freed’s deadly alcoholism is mentioned several times in the play, but you never expect drink to ensnare him. That needs to be strengthened. The payola scandal, which stunned the country, brought down several well-known DJs, and grabbed headlines for weeks, needs to be explained better.

 

Overall, though, Rock and Roll Man is a good show. It is a treasure house of entertainment history, from Freed’s career to the payola scandal. You learn just about all there is to know about how Freed emerged, caught the public eye, and became the number one DJ in America. You learn how radio DJs worked, how they moved to television (Dick Clark), and how rock and roll, so feared by the police, parents, and the schools, took over America. You learn much about the attitude of teenagers in that day (accused of being juvenile delinquents by just about everybody). There is a lot on how records became number one best sellers, the integration of music and America, and the development of the rock and roll concert, a rarity in American entertainment history.

 

And, of course, the show gives you dozens of classic songs, real finger-snappers, from the early days of rock and roll: Good Golly Miss Molly, Great Balls of Fire, I’m Walkin’, Lucille, Maybellene, and Roll Over Beethoven, to name a few.

 

So, try to ignore the weak spots in the show, put on your blue suede shoes and jitterbug out on to the dance floor.

 

Rock and roll is here to stay…

 

PRODUCTION: The show is produced by the Berkshire Theater Festival. Scenic Design: Tim Mackabee. Costumes: Leon Dobkowski. Lighting: Matthew Richards. Sound: Nathan Leigh. Choreography: Brian Reeder. Director: Randal Myler. The play runs through July 21.

Jimmy Carter, Public Historian

Jimmy Carter Sunday School, February 3, 2019, photo by Jill Stuckey

 

 

Those who have attended Jimmy Carter's Sunday School know that the time between arrival and 10:00 a.m.—measured in hours—is not really empty. A church member, often Jana Carter or Jill Stuckey, orients visitors, goes through a list of do's and don'ts, and provides short history lessons along the way. It is a presentation that church members have perfected over the years and the orientation is interactive and lively. "What have the Carters been doing in retirement?" Jill asks the audience. It is a loaded question. "Habitat!" or "Building houses!" is the most common first response. "Yes, the Carters build houses one week a year," Jill responds, smiling through gritted teeth. "How about the other fifty-one weeks?" Soon a more comprehensive accounting emerges: helping to ensure fair elections; eradicating Dracunculiasis or "guinea worm"; writing books; staying in shape; hunting and fishing. One avocation, though quite successful, is never mentioned: Jimmy and Rosalynn Carter are practicing public historians.

 

Jimmy who? A public what? The work that the Carters do at the local level meets the National Council on Public History’s inclusive definition of the field: “public history describes the many and diverse ways in which history is put to work in the world.” Consider, for instance, Sunday school itself. The history lesson begins before President Carter arrives and continues after he enters. Political scientist Jason Berggren writes that Sunday school at Maranatha Baptist Church in Plains, Georgia, serves “as a press conference of sorts” and “an occasion for presidential apologetics – an ongoing defense and explanation of his presidency.”(1) The setting also makes it a form of wide-ranging historic site interpretation that includes Carter’s upbringing, his years in the White House, his post-presidency, and—last but key to it all—his deep, compelling faith.

 

Sunday school at Maranatha is not a political rally, but it has a secular significance as public history. Guests often come from around the world to connect with the past—a need that drives much of the public history world, from heritage tourism to reenactments. President Carter serves as both historical subject and docent in these moments. “Where are you from?” he asks the audience. When someone says, “Washington State,” Carter responds, “The best nuclear-powered submarine in the world is stationed there. I’ll let you guess the name.” Carter is referring to a ship that bears his name—a subtle reminder of his history with the U.S. Navy. At the end of Sunday school, Carter shifts fully into public historian mode, shaping the way he wants people to remember him. “I used to say I’d be happy to take photographs with you after church,” he jokes with a smile. “Now, I’m willing to do them.” Or, with a smaller smile, he apologizes for his waning mobility. “My doctor tells me I have to sit during photographs. Please don’t take my sitting to mean that I think I’m better than you.”

 

 

Jimmy and Rosalynn Carter speak at Plains High School for President's Day, 2016, photo by Jill Stuckey

 

 

Jimmy Carter continues to shape his historical legacy in other ways. Indeed, to visit Plains High School, the railroad depot that served as a presidential campaign headquarters, or the Boyhood Farm is to take a guided tour led by public historian Jimmy Carter. Although Carter has written more than thirty books, his favorite to write was An Hour before Daylight: Memoirs of a Rural Boyhood (2001). Published just as his boyhood farm opened to visitors, this book has shaped interpretation at the Jimmy Carter National Historic Site more than any other. In fact, it would be difficult to overemphasize the impact of the book at the farm. According to historian Zachary Lechner, "Carter's perspective—very much in evidence throughout the site's interpretation—is omnipresent at the farm."(2) From Jimmy Carter voiceovers to written excerpts, the boyhood farm is nearly as immersive an experience as Maranatha. There are few autobiographical landscapes quite like it.

 

Both Jimmy and Rosalynn Carter also remain active members of the public history community, including the Plains Better Hometown Program and the Friends of Jimmy Carter National Historic Site. In December 2016, one year after President Carter beat cancer, the Better Hometown Program held a Christmas party in the Matthew Rylander House. Better known locally as "the haunted house," the plantation house (ca. 1850) was rented by the Carter family between 1956 and 1961. Although now vacant, it is owned by the Better Hometown Program, and the Carters led the way in stabilizing the building. The night of the party, with a torrential storm outside and only Christmas lights inside, the Carters went to each table after dinner, describing what used to be here or there and pointing out "hidden" rooms between first-floor closets and an attic. The family thought these nooks were the source of the house's haunting. The Carters also make recurring cameo appearances in a "whodunit" murder mystery series organized by Kim Fuller, Director of the Friends of Jimmy Carter NHS. The popular event is held on the SAM Shortline, an excursion train between Cordele and Plains that the Carters lobbied to bring here in 2000.

 

 

Carter painting door at the Plains High School, 2015, photo by Jill Stuckey

 

 

As board members of the Friends, the Carters play key roles, from interpretive work to fundraising. Rosalynn Carter recently led the way in putting Plains on Georgia's Camellia Trail. In 2016, the National Park Service made President Carter an honorary park ranger, and the Carters have given special programs on President's Day for years. Together, they have helped the Friends group raise millions of dollars for the park. This fundraising has enabled the organization to hire a full-time education specialist who creates museum lesson plans and coordinates field trips. In some ways, the Carters have created a living history museum of 20th-century rural America in Plains with a twist: the global perspective of a former president and first lady.

 

Orientation before Sunday school at Maranatha makes it clear that Jimmy Carter still thinks carefully about official and ceremonial titles. He does not like to be referred to as "Mr. President," an orientation leader explains, "because there is only one 'Mr. President' at a time, and that is the person who occupies the Oval Office." So be it. Indeed, we ought to respect President Carter’s wish that we not confine him too much to his time as chief executive. So address him as President Carter, or describe him by one of his less formal titles and roles: Sunday school teacher, public health leader, navy veteran, compassionate Christian, and a friend to strangers. And to these sobriquets, each more descriptive than “Mr. President,” we should add one more: Jimmy Carter, public historian.

 

Carter becomes honorary park ranger, 2017, photo by Jill Stuckey

 

(1) D. Jason Berggren, "Life after the Presidency: Jimmy Carter as Sunday School Teacher," White House Studies, vol. 13, no. 2 (2015), 111, 109-127.

(2) Zachary J. Lechner, "Commemorating Jimmy Carter and Southern Rural Life in Plains, Georgia," in Born in the U.S.A.: Birth, Commemoration, and American Public Memory, edited by Seth. C. Bruggeman (Amherst: University of Massachusetts Press, 2012), 83.

Fictional History, Patriotism, and the Fight for Scottish Independence

A screenshot from Braveheart (1995) 

 

Premiering at the 2019 Edinburgh International Film Festival in advance of its general release on June 28, 2019, Robert the Bruce, directed by Richard Gray, will “boost support for Scottish independence,” if actor and independence activist Angus Macfadyen has his way. Macfadyen revisits the role of the titular Scottish leader, which he first played in Braveheart (1995). That film, he believes, “led to a surge in Scottish nationalist confidence.” Coincidentally, within a few days of the premiere of Robert the Bruce in Scotland, former UK prime minister Gordon Brown warned that “the unity of the United Kingdom has never been at greater risk,” due to the “hijacking of patriotism” by Conservative Party leaders and Brexit bulldogs Boris Johnson and Nigel Farage, and by the Scottish National party’s embrace of “a more extreme nationalism.” 

 

To many, the contribution of popular but historically inaccurate films—and literature—to Brexit and the evolution of a misguided patriotism that fails to take account of historical and political complexities seems obvious. Perhaps even more disturbing, however, is the synergy between politics, popular culture, and economics: as promoters of Scotland as a tourist destination continue to embrace “tartan heritage” in an effort to support Scotland’s important tourist industry, they unwittingly reinforce a version of history that serves the purpose of political propaganda, rather than disseminating a nuanced understanding of Scotland’s past. 

 

The case for Braveheart’s influence on Scottish politics has been made previously by other observers, including historian Robert Brent Toplin, who noted in a 2015 History News Network article that Scottish audiences gave the film standing ovations at screenings and began supporting the separatist movement in far greater numbers after its appearance. Toplin concluded that “Braveheart’s impact on the people of Scotland reveals the potential of film to shape public opinion and agitate national politics.” It’s important to keep in mind that films such as Braveheart and Robert the Bruce, and recent books such as Diana Gabaldon’s Outlander series with its “wildly popular” Starz adaptation, are building upon a romantic vision of Scotland developed by eighteenth-century writers and Romantic visual artists and codified by the poetry and novels of Sir Walter Scott in the nineteenth century: their creative work established romantic Jacobitism as a dominant narrative of Scotland’s past. This narrative of history fostered the idea of Scotland as an “imagined community,” to use Benedict Anderson’s phrase, associated with a heroic but doomed rebellion against an indifferent, often unjust overlord, or, as it evolved over time, patriotic Scots against the cruel English colonizer. When the contemporary American novelist Diana Gabaldon came to choose the subject for her first novel, she tapped into a historical master-narrative of Scotland that already had an established set of associations and cultural values influenced by fiction. 

 

 

 

 

The truth of Scotland’s history is, of course, much more complex than the narrative of the past one finds in the realm of popular culture. Romantic artists erased the Gaelic population by visualizing Scotland as a picturesque landscape, sublime and largely empty of people, despite the presence of industry throughout the country in the eighteenth century and the rapid urbanization of Edinburgh and Glasgow. Romanticism’s promotion of Gaelic primitivism, now popularized by contemporary literature and film, has also overwritten the significant global contributions made by Scottish Enlightenment philosophers, statesmen, scientists, and innovators. Popular stories of the Jacobite Rising of 1745  typically narrate a conflict between heroic Highlanders and a better-equipped English army that overlooks the military successes and subsequent poor military decisions of Prince Charles’s army, as well as the presence of many Scots who fought and died alongside the English at Culloden. The history of the Highland Clearances is similarly more complicated than a nationalist narrative of ethnic cleansing by the English suggests. As author Madeleine Bunting has observed in her memoir Love of Country: A Journey through the Hebrides, “Racism, betrayal [by fellow Scots], and imperial exploitation: three toxic elements have been incorporated into different readings of the Clearances” (147).

 

The fictional “history” of Scotland has received and continues to receive reinforcement via the consumer website of Scotland’s national tourist board, which seeks to capitalize on the popularity of Braveheart and, now, Outlander by invoking that romantic narrative as it entices visitors and their pocketbooks to Scotland. In fact, just as the nineteenth-century tourist industry drew upon the popularity of Scott’s works to inspire readers to visit the locations he made famous, promoters of tourism today are quick to invite fans of Outlander to experience a version of Scotland that exists largely within the realm of the imaginary. 

 

One may ask why this matters: if fan tourism brings much-needed money into the country, does it matter if those tourists are ill-informed about history, so long as the inhabitants of the country know better? If historical fiction had no effect upon citizens’ perceptions and political decision-making, the oversimplification of Scotland’s history by novelists and filmmakers in quest of a good story—and the reinforcement of that story by those seeking economic gain—would not matter. But, as noted above, fiction does inform life in the case of Scotland’s independence movement: the popular story of Scotland told across print and media platforms, on screen, in books, and on websites has become, for those Scots who get their history from popular culture, the only story of their past.

 

Comments about Culloden made by members of the popular Facebook page Outlander Series Books & TV reveal that this series has constructed the history that some believe is true. As one member commented, “Scotland is where I was born and raised. . . . I never knew anything about the battle of Culloden until I watched outlander [sic].” Pop-culture-derived “history” has been similarly on display during Scottish independence rallies since the 2014 referendum. Reporting on a 2015 rally in Glasgow, VICE correspondent Liam Turbett noted the expression of “dodgy pseudo-ethnic nationalism,” which, while it resembled “a parody of everything people say to discredit the independence movement,” was cheered by those “along the fringes of the Yes movement.” Turbett supplemented his verdict of this “contortion of history” with a mention of a pro-independence sign containing a quote attributed to William Wallace—but really made by “his fictional dad in the film Braveheart.” 

 

Whose responsibility is it to ensure that a more nuanced understanding of history is shared widely, especially among those who may lack the interest in or ability to access the scholarship of historians? The example of Scotland and the forces unleashed by Brexit and the current nationalist debate illuminate the importance of understanding how commercial and political entities use pseudo-historical narrative for self-promotion and the creation of an imagined community. However, it may be as important for serious writers and filmmakers to create historical fiction more thoughtfully. Knowing that literature and film can shape public opinion and beliefs about the past, writers and readers who crave a better-informed populace may need more often to use the power of the pen to avert the power of the sword.

Gresham's Law of Reading: Bad Reading Drives Out Good

 

James W. Loewen is a sociologist. The New Press recently brought out new paperbacks of Loewen's bestseller Lies My Teacher Told Me and of Sundown Towns, about places that were/are all-white on purpose. 

 

Gresham's Law, as I'm sure you recall from Econ. 101, states, "Bad currency drives out good." It works like this. Suppose you have $100 in gold coins and $100 in paper bills. You want to buy a sport coat for $99. (I did buy a sport coat for $99, just before Christmas.) Are you going to hand over your gold coins or your paper bills? 

 

You're going to hand over your paper bills. At least most of us will.

 

After all, the paper bills depend upon the backing of the government. The gold coins have intrinsic value. If North Korea or an ISIS terrorist sets off a nuclear bomb in D.C., where I live, I can escape in my car, camp out in southern Pennsylvania, and maybe trade a gold coin for some bread and cheese from the nearest Amish farmer. Even without the threat of societal breakdown, the gold coins also look nice, so I derive pleasure from merely owning them. From the paper, not so much. 

 

As a result, gold coins don't work as currency. People don't exchange them. They hoard them. By definition, "currency" is "a medium of exchange." Bad money has driven out good. 

 

So it goes with reading, at least for me. My current fiction read is Cloud Atlas, a complex, remarkable novel by David Mitchell that takes place in 1841, 1931, more or less the present, and several future eras. I recommend it to you. 

 

I've been reading it for years. First, I used it as bedtime reading. This didn't work because, to the annoyance of my spouse, I fall asleep within 30 seconds of opening it. Then I switched to taking it on trips with me. 

 

Cloud Atlas has now been to, in chronological order, West Virginia, Indiana, Colorado, Montana, Minnesota, Georgia, California, Wisconsin, Philadelphia, New York City, Switzerland-to-Amsterdam on the Rhine, the United Kingdom, the Bahamas, New York City again, Vermont (twice), and Massachusetts (three times). A year ago it visited the Azores (which were excellent, by the way). This past April, it went down the Nile (a bucket-list trip, fascinating in many ways). Just last month, it ventured to Portland, Oregon, and then to Minnesota. Still, I didn't finish it.  

 

What is going on? 

 

It's Gresham's Law of Reading. Bad reading drives out good. 

 

Specifically, it's the newspaper: in my case, the Washington Post. It's Time, Smithsonian, and Multicultural Perspectives. It's The National Museum of the American Indian. (Yes, that's a magazine as well as the institution that puts it out.) God help me, it's AARP the Magazine and whatever the magazine is called that AAA sends me. I am always behind on reading them, so I always pack a stack of them on my trips. Since I don't want to bring them back home, I read them first, so I can throw them out. Consequently, I rarely get to the gold. 

 

This pattern does have one payoff: I do catch up on my magazines. This saves me from the fate of a Time subscriber whose letter I still recall from about 1952, when I was ten years old, reading my father's magazine. From memory, it went, 

 

I really like your magazine. You're doing a fine job. However, it is too much material for me. I file each new issue on my bookshelf on the right, and I read them from the left. Right now I'm in the middle of 1943, and I can't wait to see how it all turns out!

 

On my last day on earth, however, I shall be sad if I have not finished Cloud Atlas. I doubt I'll lament not having finished the latest AARP. 

 

Could this perhaps be a metaphor? On that day, might I also be sad, not having taken care of the important things — the gold — while wasting my time on tasks that have currency, but no real value? 

Flight Girls: Remembering World War 2's Women Airforce Service Pilots

 

What is a hero? It’s a question I’ve pondered off and on for the past seven years, ever since I came across a stack of books at my aunt’s house and read a piece of WWII history I hadn’t previously known. 

 

The Women Airforce Service Pilots (WASP) program was the brainchild of famed aviatrixes Jacqueline Cochran and Nancy Harkness Love, who—with the assistance of General Henry “Hap” Arnold, the commanding general of the Army Air Forces—built a program teaching female pilots to fly every type of airplane the military owned, so long as they met the age and height requirements, had 500 flying hours each, and held a pilot’s license. They were taught to fly “the Army way”: they flew warplanes that had been damaged in battle and planes right off the production line, flew simulated strafing missions, and towed gunnery targets for live-ammunition training. The women who flew were bound by spirit and duty, bravery and skill… and bonded by their love of country and a job they knew they could do well. Some say they could handle those planes better than many of the men.

 

If they washed out, they had to pay their own way home. If they were injured or killed, it was up to their friends and family to get them the care, or the casket, they needed. 

 

Once training was finished, they were sent to one of the many military bases across the country, where they ferried planes from base to base, transported military personnel and cargo, or continued testing new planes. There wasn’t always a designated space for them to bunk, so sometimes they slept in the nurses’ quarters. Other times they had to get a hotel room. No plane to fly back to the base you just landed at? No problem! Wait around for a day or more, or get yourself a ticket on a commercial flight – on your own dime, of course. They weren’t allowed to pack much in the way of clothing—warplanes don’t always have a lot of room for luggage—so they tucked spare bits of clothing into the cockpits’ nooks and crannies. There were undergarments in logbooks, a pair of heels beside a seat. Sometimes they got stuck in a city for days, washing and re-washing the few items of clothing they’d brought until they could get back to their home base. 

 

They did this without complaint or expectation. They did this so the men could go to war.

 

I was stunned by anecdotes of bravery, death, and outright misogyny. And I was baffled that the subjects of these stories had tried to be heard but still, seventy-seven years later, remained for the most part unknown to the greater public. 

 

On a humid and windy May morning, I arrived at what is now the Texas State Technical College. Seven-plus decades ago, though, in place of the brick buildings stood long wooden structures that housed the pilots who trained here. There were offices and a chow hall, classrooms, and hangars. Boots marched on this dirt. Planes buzzed overhead at all hours of the day and night in the wide-open blue sky. This had been Avenger Field, where from 1942 to 1944, 1,074 women served their country with bravery and a whole lot of moxie. 

 

What brought me there was the annual WASP Homecoming Reunion. I had heard there would be five members attending; only two were able to make the trip. Kay Hildebrand and Dorothy Lucas were greeted with a salute and escorted from their cars by servicewomen and men, who rolled them in their wheelchairs between two walls bearing their comrades’ names and helped them onto the low brick wall that encircled a wishing well, the same one they’d jumped into when they graduated from the program so many years before, and where they sat now, smiling at their admiring crowd. 

 

The faces smiling back were both young and old. Some women wore outfits of an era gone by, their hair in Gibson Rolls, their lips painted red. There was one dressed as Rosie the Riveter and a young girl named Jenna sporting a pilot’s costume, goggles perched upon her little head. There were family members and fans, and there were the women who came after. Women who may never have had the chance to wear an Air Force uniform if not for the two women by the fountain. 

 

Those two women represented the 1,074 who served. They did not fight at Pearl Harbor. They didn’t storm the beaches of Normandy. They didn’t serve in the Pacific or stand on the front lines of any battle. They never stared down the barrel of a rifle, waiting to plunge a bullet into a Nazi soldier racing to land his shot first. 

 

But they did serve their country. They served at home, on American soil. They served without military status or benefits. Without expectation or praise.

 

These are the women history forgot.

 

Let me rephrase.

 

These are the women erased from history.

 

Do they not deserve recognition purely because they weren’t allowed to set foot on a front line? Or drop a bomb from thousands of feet in the air? Or sit in the ball turret of a B-17 firing a machine gun?

 

They signed up with no chance at being promoted. No raise in their future. No contract stating they’d be taken care of. And they did it with pride… and barely a thank you in return. 

 

The WASP program ended on December 20, 1944. The women, with some exceptions, were responsible (of course) for getting themselves home. If they wanted to have a career flying, they’d have to find it elsewhere. They were no longer invited to fly the military’s aircraft. 

 

And that was it. The file on them was sealed and for thirty-five years there was not a peep about what these birds of war had done. How they had stood up to serve their country – and how their country disserved them. 

 

In 1977, after much debate between the Veterans Administration and the Department of Defense – the former against, the latter in favor – the WASP were finally granted veteran status, and President Jimmy Carter signed it into law on November 23rd of that year. On March 10, 2010, President Barack Obama awarded the WASP the Congressional Gold Medal.

 

And yet, why do so few know about these fearless flyers STILL? Why is their story not being added to curriculums across the country? Being taught in elementary schools, high schools, colleges? They are barely a blip on the History Channel’s website. I almost fell off the sofa when Josh Gates of Expedition Unknown went in search of Gertrude “Tommy” Tompkins, a WASP who went missing after taking off from an airfield decades ago. I shouldn’t be surprised to see these stories. It should be a given that ALL those who served have their stories told. There should be field trips to the National WASP Museum. There should be mentions and parades and films! 

 

I stood watching the two remaining WASPs sitting at that wishing well. When I’d first learned of their service, eight years ago, I was astonished and outraged that I hadn’t known before. I hadn’t expected that I would fall in love with their stories. That I would write a book inspired by them. That I would become one of their biggest fans and greatest champions.

 

What is a hero? By definition, a hero is a person admired for courage, outstanding achievements, or noble qualities.

 

I think the WASP fit that bill. They are certainly my heroes.

Report on the National Strategy Meeting of Historians Convened by Historians for Peace and Justice

 

Close to fifty historians attended the day-long National Strategy Meeting of Historians at Columbia University on May 28, 2019. Historians for Peace and Justice (H-PAD) convened the meeting. The unprecedented gathering of historians independent of a formal conference testified to the urgent need many of us feel to continue and expand our opposition to the Trump regime as well as to the multiple crises that confront us in this country and around the world. Thanks to the efforts, enthusiasm, and contributions of a number of historians, the meeting was stimulating, congenial, and successful, despite being organized on a shoestring budget. For the list of attendees and the agenda, go to https://www.historiansforpeace.org/national-strategy-meeting-of-historians/.

 

Van Gosse and Margaret Power opened the meeting, which then broke into small groups to discuss several questions, among them: (1) What is the role of historians in this time of acute global and national crises? (2) How can we go forward together, forging stronger alliances and connections nationally and locally, with each other as engaged scholars and with the larger movements? (3) How important is it to act within our profession, including its associations? The body then reconvened, and a representative from each group reported on the main points it had explored. No clear consensus emerged from the reports; instead, a wide range of opinions and priorities was expressed.

 

In the afternoon, people attended one of six work groups, based on what they wanted to work on. The six working groups and conveners that emerged and are currently functioning are the following: 

Direct Action/Combatting the Right’s Fake News, Contact:  Jeremy Varon, jvaron@aol.com

Empire and War, Contact: Prasannan Parthasarathi, prasannan.parthasarathi@bc.edu

K-12, Contact: Barbara Winslow, bwpurplewins@gmail.com

Democratize the Academy/Smash the Carceral State, Contact:  Andy Battle, andrew.battle@gmail.com 

Palestine, Contact:  Leena Dallasheh, leena.dallasheh@gmail.com

Immigrants’ Rights, Contact: Alex Avina, Alexander.Avina@asu.edu, and Margaret Power, marmacpower1@gmail.com 

 

If you are interested in finding out more about the groups or in joining one of them, please contact the convener listed above. H-PAD hopes that other working groups will also form, so if you are interested in forming or participating in one, please contact us and we will announce them and put people with similar interests in contact with each other.

 

The group also discussed whether to form a new organization to incorporate all the non-H-PAD people in attendance or whether to continue and expand H-PAD. Participants overwhelmingly decided it would serve no purpose to start a new organization and that those who wanted to get involved in the work should join H-PAD. For more information, go to https://www.historiansforpeace.org/

The End of Humanitarian Intervention? A Debate at the Oxford Union With Historian David Gibbs and Michael Chertoff

 

The issue of humanitarian intervention has proven a vexing one for the political left during the post-Cold War era. In light of mass violence in Rwanda, Bosnia-Herzegovina, Kosovo, Darfur, Libya, and Syria, many leftists abandoned their traditional opposition to militarism and argued for robust military intervention by the United States and its allies to alleviate these crises. Critics argued in response that interventionism would end up worsening the very crises it was supposed to resolve. These issues were recently debated at the Oxford Union Society at Oxford University on March 4, 2019. The participants were Michael Chertoff -- former Secretary of Homeland Security during the presidency of George W. Bush and coauthor of the USA Patriot Act -- who presented a qualified defense of humanitarian intervention, and myself, who argued against the practice. 

 

In past years, when I debated this issue, I was struck by the sense of almost religious zeal that characterized advocacy for interventionism. “We have to do something!” was the standard refrain. Those who offered criticisms -- including myself -- were cast as amoral heretics. However, the repeated failures of interventionism that I note below have taken their toll and have served to moderate the tone. During the Oxford debate, I noted a remarkable absence of emotionalism. I came away from the event sensing that, while some still defend humanitarian intervention, their arguments lack the crusading tone that was so noteworthy in the past. I sense that public support for interventionism is beginning to ebb.

 

What follows is a verbatim transcript of the full statements by myself and Mr. Chertoff, as well as our responses to questions posed by the moderator and a member of the audience. For reasons of brevity, I have omitted most of the audience questions, as well as the responses. Interested readers can find the full debate at the Oxford Union’s YouTube site.

 

 

Daniel Wilkinson, Oxford Union President

So, gentlemen, the motion is: “This house believes humanitarian intervention is a contradiction in terms.” And Professor Gibbs, your ten-minute opening argument can begin when you’re ready.

 

Professor David Gibbs

Thank you. Well, I think that when one looks at humanitarian intervention, one has to look at the record of what has actually happened, and in particular the last three major interventions since 2000: the Iraq intervention of 2003, the Afghanistan intervention of 2001, and the Libya intervention of 2011. What all three of these have in common is that all three were justified at least in part on humanitarian grounds. I mean, the first two partly, the third almost exclusively, were justified on humanitarian grounds. And all three produced humanitarian disasters. This is really quite clear, I think, to anybody who has been reading the newspaper: these interventions have not gone well at all. And when evaluating the larger issue of humanitarian intervention, one really has to first look at those basic facts, which are not pleasant. Let me add that in a lot of ways I would have expected the whole concept of humanitarian intervention to be fully discredited by those experiences, but it was not. 

 

We still have calls for other interventions, most notably in Syria. Also, there are frequent calls for regime change, essentially intervention, in North Korea. I really don't know what is going to happen in the future with North Korea. But if the United States does undertake regime change in North Korea, I will hazard two predictions: one, it almost certainly will be justified at least in part as a humanitarian intervention designed to liberate the people of North Korea from a very unwholesome dictator; and two, it'll probably produce the biggest humanitarian disaster since 1945. One of the questions is: Why are we not learning from our mistakes? 

 

The scale of the failures in these three previous interventions is in a lot of ways quite impressive. With regard to Iraq, it's perhaps the best documented failure, I would say. We have the 2006 Lancet study, which looked epidemiologically at excess deaths in Iraq, estimated at that time at 560,000.(1) This was published in 2006, so presumably the toll is much higher by now. There have been other estimates, mostly on par with that one. And this is something that is problematic. Certainly, things were terrible under Saddam Hussein, that’s indisputable, as they were under the Taliban, as they were under Muammar Gaddafi, as they currently are under Kim Jong Un in North Korea. And so we went in and removed those three figures from power one by one (or I should say, with the Taliban, it was a larger regime, led by Mullah Omar), and things promptly got worse. It didn't seem to have occurred to policymakers that things could actually get worse, but they did. 

 

Another effect that's worth noting is what I would call a kind of destabilization of regions. This is particularly striking in the case of Libya, whose destabilization spread through much of North Africa, triggering a secondary civil war in Mali in 2013 that was directly attributable to the destabilization of Libya. This required a secondary intervention, by France this time, to combat the instability arising in that country, again justified at least in part on humanitarian grounds. 

 

Certainly, one of the things one can say about the effects of humanitarian intervention is that if you have a vested interest in intervention, and that is something you are seeking, it's an excellent idea, because it's the gift that just keeps on giving. It keeps on destabilizing regions, producing new humanitarian crises, thus justifying new interventions. That's certainly what happened in the case of Libya and then Mali. If you're interested in humanitarian effect, however, the situation does not look so good. It does not look very positive at all. 

 

The very striking thing here is the lack of any loss of credibility. I'm very struck by the fact that the people who helped to argue for these three interventions -- and by that I don't just mean policymakers, but also academics and intellectuals like myself (I myself didn't argue for them, but many of my colleagues did) -- have lost no credibility. It's rather remarkable to me that there's no expression of regret or acknowledgement that they did anything wrong in arguing for these interventions. Nor is there any effort to learn from our mistakes and to try to avoid interventions in the future. There's something very dysfunctional about the character of discussion on this topic, when we fail to learn from past mistakes. 

 

A second problem with the issue of humanitarian intervention is what some have called the “dirty hands” problem. We are relying on countries, and agencies of those countries, which do not have very good records of humanitarian activity. Let us look at the United States and its history of interventionism. If one looks at that history, one finds that the United States as an intervening power was a major cause of humanitarian crises in the past. Consider, for example, the overthrow of Mossadegh in Iran in 1953 or the overthrow of Allende in Chile in 1973. And I think the most striking example, a less known one, is Indonesia in 1965, where the CIA helped engineer a coup and then helped orchestrate a massacre that led to about 500,000 deaths. It's one of the really great massacres post-1945, yes indeed, on the scale of what happened in Rwanda, at least approximately. And that was something caused by intervention. One could also go into the issue of the Vietnam War and look, for example, at the Pentagon Papers, the secret Pentagon study of the Vietnam War; one does not get a sense of the United States as either a gentle power or a particularly humanitarian one. And the effects certainly were not humanitarian in any of these cases. 

 

There's a larger issue perhaps of human rights violations by the agencies of state that are involved in intervention in the United States. We now know from declassified documents that both the uniformed military and the CIA were responsible in the 50s and early 60s in conducting radiation experiments on unsuspecting individuals; doing things like going around and having doctors working for the military injecting people with radioactive isotopes and then tracking their bodies over time to see what effects it had and what kinds of illnesses it caused them -- without telling them of course. The CIA had very disturbing mind-control experiments, testing new interrogation techniques on unsuspecting individuals, with very damaging effects. One of the scientists involved in the radiation studies commented in private, again this is from a declassified document, that some of what he was doing had what he called the “Buchenwald” effect, and we could see what he meant. And the obvious question again is: Why on earth would we want to trust agencies that do things like this to do something humanitarian now? This is a course long ago. But the fact that we now use the term “humanitarian intervention” does not make it a magical phrase and does not magically erase this past history, which is relevant and has to be taken into account. I do not want to focus excessively on my own country after all. Other states have done other disturbing things. One could look at the history of Britain and France, let us say, with the colonial and postcolonial interventions. One does not get a picture of humanitarian activity; quite the contrary I would say, either in intent or in effect. 

 

Now, I think one of the issues that finally has to be noted is the cost of humanitarian intervention. This is something that is rarely taken into account, but perhaps should be, especially since the record of results is so bad in terms of humanitarian effect. Military action, generally speaking, is extremely expensive. Amassing division-sized forces and deploying them overseas for extended periods of time cannot be done except at extreme expense. In the case of the Iraq War, what we have is what has been termed “the three trillion-dollar war.” Joseph Stiglitz of Columbia and Linda Bilmes estimated in 2008 the long-term cost of the Iraq War at $3 trillion.(2) Those figures are of course obsolete, because that was over ten years ago, but $3 trillion is quite a lot when you think about it. In fact, it's greater than the gross domestic product of Great Britain at the present time. And one wonders what kind of wonderful humanitarian projects we could have done with $3 trillion, rather than wasting it on a war that did nothing but kill several hundred thousand people and destabilize a region. 

 

And these wars are not over, of course, in Libya, or Iraq, or Afghanistan. Afghanistan is nearing the end of its second decade of war and the second decade of US intervention. This may very well turn out to be the longest war in US history, if it is not already; it depends how you define longest war, but it's certainly getting up there. And one can think of all sorts of things that could have been done with some of this money -- for example, vaccination of children who are under-vaccinated. (Two minutes, is that right? One minute.) One could think of people who don't have enough medicines, including in my own country, the United States, where many people go without proper medicines. As economists know, you have opportunity costs: if you spend money on one thing, you may not have it available for another. And I think what we've been doing is overspending on intervention, again with no significant humanitarian results, or very few that I can discern. I guess I'm very impressed by the medical analogy here and the medical emphasis, and that's of course why I titled my book “First Do No Harm.” The reason is that in medicine you don't just operate on the patient because the patient is suffering. You have to do a proper analysis of whether or not the operation will be positive or negative. An operation can of course hurt people, and in medicine sometimes the best thing to do is nothing. And perhaps here, the first thing we should do with humanitarian crises is not make them worse, which is what we've done. Thank you.

 

Wilkinson

Thank you, Professor. Michael, your ten-minute argument can begin when you’re ready.

 

Michael Chertoff

The proposition here is whether humanitarian intervention is a contradiction in terms, and I think the answer to that is no. Sometimes it's ill-advised, sometimes it's well-advised. Sometimes it doesn't work, sometimes it does work. It rarely works perfectly, but nothing in life does. So, let me first begin by talking about the three examples the professor gave: Afghanistan, Iraq, and Libya. I'm going to tell you Afghanistan was not a humanitarian intervention. Afghanistan was the result of an attack launched on the United States that killed 3,000 people, and it was quite openly and deliberately an effort to remove the person who launched the attack from the ability to do it again. If you think it wasn't worth it, I will tell you from personal experience: when we went into Afghanistan, we found laboratories al Qaeda was using to experiment with chemical and biological agents on animals, so they could deploy those against people in the West. Had we not gone into Afghanistan, we might be inhaling those now as we speak. This is not humanitarian in the sense of altruistic. This is kind of basic, core security that every country owes its citizens. 

 

Iraq also, I think, was not principally a humanitarian intervention. We can debate in a different debate what happened with the intelligence, and whether it was totally wrong or only partially wrong, regarding the possibility of weapons of mass destruction in Iraq. But at least that was the major assumption going in. It may have been erroneous, and there are all kinds of arguments that the way in which it was executed was poorly done. But again, it was not humanitarian. Libya was a humanitarian intervention. And the problem with Libya is, I think, the second part of what I want to say, which is that not all humanitarian interventions are good. In order to make a decision to intervene, you have to take into account some very important elements of what you're facing. What is your strategy and your objective, and do you have clarity about that? What is your awareness of what the conditions in the place you're intervening in actually are? What are your capabilities and your willingness to be committed to see things through to the end? And then, to what degree do you have support from the international community? Libya is an example of a case where, while the impulse may have been humanitarian, these things were not carefully thought out. And if I can say so, Michael Hayden and I made this point in an op-ed shortly after this process began:(3) the easy part was going to be removing Gaddafi; the hard part was going to be what happens after Gaddafi is removed. And so here I agree with the professor. Had someone looked at the four factors I mentioned, they would have said: “Well, you know, we don't really know; we haven't really thought through what happens without Gaddafi.” What happens to all the extremists in prison? What happens to all the mercenaries that he's paid for, who now aren't getting paid anymore? And that led to some of the negative results. I also think there was a failure to understand that when you remove a dictator, you have an unstable situation. And as Colin Powell used to say, if you broke it, you bought it. If you're going to remove a dictator, you've got to then be prepared to invest in stabilizing. If you're not prepared to make that investment, you have no business removing him. 

 

By way of example on the other side, look at the interventions in Sierra Leone and Ivory Coast. In Sierra Leone in 2000, the Revolutionary United Front was advancing on the capital. The British came in and repelled them; they drove them back. And because of that, Sierra Leone was able to stabilize, and it ultimately wound up having elections. Or Ivory Coast: you had an incumbent who refused to accept that he had lost an election. He began to use violence against his people. There was an intervention. He was ultimately arrested, and now Ivory Coast has a democracy. So again, there are ways to do humanitarian intervention that can be successful, but not if you don't pay attention to the four characteristics I talked about. 

 

Now, let me give you an example from something that we are literally facing today, and that is what is going on in Syria. Let's ask whether, a couple of years ago, before the Russians got deeply involved and before the Iranians got deeply involved, an intervention would have made a difference in saving literally tens of thousands of innocent civilians from being killed with bombs and chemical weapons, as well as averting a huge mass migration crisis. And I think the answer is: had we done in Syria what we did in northern Iraq in 1991 -- established a no-fly zone and a no-go zone for Assad and his people -- and if we had done it early, we might have averted what we now see unfolding and continuing to unfold in the region. So, now I'm going to look at it from the other lens: What happens when you don't intervene, as I suggest that we might have done in Syria? Well, not only do you have a humanitarian crisis, you have a security crisis. As a consequence of not really enforcing any of the rules I've talked about -- notwithstanding the fact that President Obama said there was a red line about chemical weapons, a line that disappeared when the chemical weapons were used -- we had not only many deaths, but literally an upheaval that has now reached into the heart of Europe. The reason the EU is now having a crisis about migration is because, perhaps with some intent, the Russians as well as the Syrians deliberately acted to drive civilians out of the country and force them to go elsewhere. Many of them are now in Jordan, putting a strain on Jordan, but many of them are trying to get into Europe. And I have little doubt that Putin understood, or quickly recognized, even if it was not his original intent, that once you create a migration crisis, you are creating disorder and dissension within your principal adversary, which is Europe. And that has a destabilizing effect, the consequences of which we continue to see today. 

 

And so, one of the things I want to say, to be honest, is that when we talk about humanitarian intervention, there is often an altruistic dimension to it, but frankly there is also a self-interested dimension. Places of disorder are places where terrorists operate; you've seen that ISIS until quite recently held territory in parts of Syria and parts of Iraq that were not properly governed. Disorder creates migration crises and similar crises, which then have an impact on the stability and the good order of the rest of the world. And it also creates grievances and desires for payback that often result in cycles of violence that continue over and over again, and you see that in Rwanda. 

 

So, my bottom line is this: Not all humanitarian interventions are warranted, and not all humanitarian interventions are properly thought out and properly executed. But by the same token, not all of them are wrong or improperly executed. And again, I go back to 1991 and the no-fly zone and no-go zone in Kurdistan as an example of one that worked. The key is this: Be clear why you're going in; don't underestimate the cost of what you're undertaking; have the capabilities and the commitment to see that you can handle those costs and achieve the result that you set out for yourself. Make sure you are aware of the conditions on the ground, so you make a rational assessment. And finally, get international support; don't go it alone. I think in those circumstances, humanitarian intervention can not only be successful, but it can save a lot of lives and make our world more secure. Thank you.

 

Question (Wilkinson)

Thank you, Michael. Thank you both for those introductory remarks. I’ll ask one question, and then we’ll move over to questions from the audience. My question is this: You both cited a number of historical examples. But would you say it is a fair assessment that, practically, the problem is that there can never be a sufficient long-term plan, sufficiently good intentions, sufficiently benevolent motivations, or a sufficient harm analysis to counter the fact that individual organizations and international organizations are fallible, and they will always make mistakes? And the fallibility of those groups means that humanitarian intervention has to be a contradiction in terms. So, Michael, if you’d like to respond. 

 

Answer (Chertoff)

My answer is this: Inaction is action. Some people think that if you don't do something, that's somehow abstaining. But if you don't do something, something is going to happen. So, if for example Franklin Roosevelt had decided not to help the British in 1940 with Lend-Lease, because “I don't know if I'm making a mistake or not,” that would have resulted in a different outcome with respect to World War II. I don't think we'd be saying “well, but that was inaction, so it didn't matter.” I think inaction is a form of action. And every time you're presented with a choice, you have to balance the consequences, as far as you can project them, of both doing something and abstaining from doing something. 

 

Answer (Gibbs)

Well, I think that of course inaction is a form of action, but the onus should always be on the person advocating intervention. Because let's be very clear on this: intervention is an act of war. “Humanitarian intervention” is a mere euphemism. When we advocate humanitarian intervention, we are advocating war. The movement for intervention is a movement for war. And it seems to me that those who advocate against war really have no burden of proof on them. The burden of proof should be on those who advocate for the use of violence, and really the standards should be very high for the use of violence. And I think we can see it's been used quite frivolously in the past, to an extraordinary degree. 

 

And a basic problem you have with small interventions -- for example, the 1991 no-fly zone over Iraq -- is that these things take place in the real world, not in a pretend world. And in that real world, the United States considers itself a great power, and there'll always be the question of American credibility. If the U.S. undertakes half measures, such as a no-fly zone, there will always be pressures on the United States from various factions in the foreign policy establishment to take a more maximalist effort and solve the problem once and for all. Hence the need for another war with Iraq in 2003, producing an utter catastrophe. I get very queasy when I hear people saying “let us just do a limited intervention, it'll just stop at that,” because it usually doesn't stop at that. There's the quagmire effect: you step into the quagmire, and you get deeper and deeper into it. And there will always be those who advocate deeper and deeper intervention.

 

I guess one more point: I did want to respond to the claim, which is a frequent one, that the Iraq and Afghanistan wars were not really humanitarian interventions. It is true that, to some extent, both interventions were at least partly about traditional national interest, realpolitik, and the like. But if you look back at the record, clearly both were justified in part as humanitarian interventions, both by the Bush administration and by many academics. I have here before me an edited volume published by the University of California Press, I believe in 2005, called A Matter of Principle: Humanitarian Arguments for War in Iraq.(4) Just do a Google search on “humanitarian arguments for war in Iraq,” and you'll see this was very much part of the picture. I think it's a bit of a rewriting of history to say that humanitarian intervention was not a significant factor in the arguments for war in Iraq or Afghanistan. Humanitarian arguments were very much part of both those wars. And I would say the results very much discredit the idea of humanitarian intervention.

 

Question (Audience)

Thanks. You've both talked about some historical examples, and I'd like to hear both of your perspectives on the ongoing situation in Venezuela. Reports have come out that the Trump administration might have plans to use military force there. How would you evaluate that in light of the perspectives you've each shared?

 

Answer (Chertoff)

So, first of all, I think what's happening in Venezuela is obviously a political dictatorship. And as I've said, I don't think political regime issues are a reason to intervene militarily. There is also a humanitarian element here. People are starving. But I don't know that we're at the level of humanitarian crisis that we've seen in other cases. So, my short answer would be: I don't think we've met the threshold for having a real discussion about humanitarian intervention in a military sense. 

 

That's not to say there aren't non-military ways to intervene, just to be clear so we round the picture out. There are a lot of tools in the toolbox when you deal with intervention. There are sanctions, economic sanctions. There is even potential use of cyber tools as a way of having some impact on what's going on. There is the possibility in some instances of legal action, for example International Criminal Court or something. So, all of these ought to be considered part of the toolbox. If I was looking at Venezuela, assuming it did, which I emphasize it has not, reach the level of humanitarian intervention, you would then have to balance issues like: Is there an endgame we see or a strategy we see to be successful? Do we have the capabilities to achieve it? Do we have international support? I think all of those would probably militate against it. That's not to say it couldn't change, but the dimensions of this I don't think have reached the point where military action is reasonable or likely.

 

Answer (Gibbs)

Well, the most important thing you need to know about Venezuela is that it's an undiversified oil-exporting economy, and there's been a drop in oil prices since 2014. I'll certainly grant that a lot of what is going on now is the fault of Maduro and the authoritarian actions he's been taking, as well as mismanagement, corruption, and so on. But most of what has been going on, by any reasonable and informed reading, is due to low oil prices. 

 

It points, I think, to a larger issue, which is the way humanitarian crises are often triggered by economic crises. Discussions of Rwanda almost never mention the fact that the genocide -- and I think it really was a genocide in the case of Rwanda -- the genocide by the Hutu against the Tutsi, took place in the context of a major economic crisis resulting from the collapse of coffee prices. Again, a very undiversified economy, reliant almost exclusively on coffee. Coffee prices collapse; you get a political crisis. Yugoslavia had a major economic crisis just before the country broke up and descended into hell. We know about the descent into hell; most people don't know about the economic crisis. 

 

For some reason people find economics boring, and because it's boring and military intervention seems more exciting, we think the solution is to send in the 82nd Airborne Division. Whereas perhaps it would have been simpler, and a lot cheaper and easier and better from a humanitarian standpoint, to address the economic crisis: the very heavy emphasis placed on austerity in the international economic system, and the very damaging political effects austerity has in many countries. Historical context is necessary here: for all the constant, repetitious references to the Third Reich and to World War II, which we hear again and again and again and again, people often forget that one of the things that brought us Adolf Hitler was the Great Depression. Any reasonable reading of Weimar Germany's history would be that without the Depression, you almost certainly would not have gotten the rise of Nazism. So, I think the economic issues need greater attention in the case of Venezuela. Even if the United States were to overthrow Maduro by whatever means and replace him with someone else, that someone else would still have to deal with the issue of low oil prices and their damaging effects on the economy, which would remain unaddressed by humanitarian intervention, whether we call it that or something else. 

 

I guess another point about the United States and Venezuela is that the United Nations sent a representative down there who condemned the US sanctions as greatly intensifying the humanitarian crisis. So the intervention the United States has been conducting -- economic at this point, mostly, rather than military -- is making things worse, and that clearly has to stop. If we're interested in helping the people of Venezuela, surely the United States should not want to make things worse.

 

(1) Gilbert Burnham et al., “Mortality after the 2003 Invasion of Iraq: A Cross-Sectional Cluster Sample Survey,” Lancet 368, no. 9545 (2006). Note that the Lancet’s best estimate of excess deaths due to the invasion is actually higher than the one I cited above. The correct figure is 654,965, rather than the 560,000 that I presented.

(2) Linda J. Bilmes and Joseph E. Stiglitz, The Three Trillion Dollar War: The True Cost of the Iraq Conflict. New York: Norton, 2008.

(3) Michael Chertoff and Michael V. Hayden, “What Happens after Gaddafi is Removed?” Washington Post, April 21, 2011.

(4) Thomas Cushman, ed., A Matter of Principle: Humanitarian Arguments for War in Iraq. Berkeley: University of California Press, 2005.

Historian Ian Reifowitz on How the Race-Baiting Invective of Rush Limbaugh on the Obama Presidency Led to Trump

 

Ultimately, the right wing needs white racial anxiety. In fact, it cannot survive without it.

Ian Reifowitz, The Tribalization of Politics

 

On January 20, 2009, Barack Obama was inaugurated as the forty-fourth president of the United States of America—the first African American to attain this exalted office. Hundreds of thousands crowded the National Mall during the ceremony to wish the new president well.

 

However, rather than offering the president words of encouragement and congratulations, voices from the far right almost immediately expressed the hope that President Obama would fail and serve no more than one term. He had inherited a faltering economy, a war, and a country still divided by race and other vexing issues. Meanwhile, the right-wing media labeled him anti-American and unpatriotic, a black president who would please his constituents of color to the detriment of white citizens. 

 

Popular far-right talk radio host Rush Limbaugh was one of the most vociferous of these voices, and the one with the largest audience. He uttered unceasing, racially charged attacks on President Obama virtually every day of his two terms in office. 

 

Historian Ian Reifowitz examines Limbaugh’s hateful invective and American political polarization in his new book, The Tribalization of Politics: How Rush Limbaugh’s Race-Baiting Rhetoric on the Obama Presidency Paved the Way for Trump (Ig Publishing).

 

To better understand the attacks on President Obama, Professor Reifowitz took on the daunting task of analyzing the transcripts of Limbaugh’s radio shows and associated materials from the Obama years. As a result, he has documented the manifold instances of Limbaugh’s hateful race-baiting and “othering” of the president. Limbaugh, for his part, has profited greatly by stoking white fears of racial peril. 

 

The book traces the election of Donald Trump and the recent rise in white supremacist activity to the incendiary language of racism that the right wing relies on to win politically. Historian Keri Leigh Merritt commented that The Tribalization of Politics “is a must-read for anyone seeking to understand how the US has reached its lowest point in race relations since the Civil Rights Movement.” 

 

Professor Reifowitz teaches history at Empire State College of the State University of New York. His other books include Imagining an Austrian Nation: Joseph Samuel Bloch and the Search for a Multiethnic Austrian Identity, 1846-1919, and Obama’s America: A Transformative Vision of Our National Identity. He has published a number of academic articles in the Journal of Jewish Identities, Nationalities Papers, and East European Quarterly, among others. Professor Reifowitz is also a contributing editor at Daily Kos, and his articles have appeared in the Daily News, Newsday, The New Republic, In These Times, Truthout, Huffington Post, and others. His awards include the 2009 Susan H. Turben Award for Scholarly Excellence, and the 2014 S.U.N.Y. Chancellor's Award for Excellence in Scholarly and Creative Activities.

 

Professor Reifowitz graciously responded to a series of questions in an email exchange on his work and his new book. 

 

Robin Lindley: You’re a historian specializing in the modern history of the United States. How did you decide to study history, and then to focus on the American past?

 

Professor Reifowitz: I’ve always, since I was in college (too many years ago), been interested in multiethnic societies, and specifically in how they work to create ‘national’ bonds across lines of ethnicity that bind together their diverse populations. 

 

My graduate study, which led to my first book and other early academic publications, focused on Austria-Hungary. That state tried and failed to create strong enough national bonds, i.e., bonds based on citizenship in and loyalty to a common state, that would have allowed it to survive World War I and the overthrow of the Habsburg dynasty. Even while pursuing that research, I’d also been reading and thinking about another multiethnic society, the one we live in, that faces some of the same issues (thankfully, we don’t have to rely on a monarchy as the foundation of our unity). 

 

Eventually, my passion for understanding how unity and diversity were playing out today drove me to begin writing and researching the contemporary U.S. I published a couple of articles in The New Republic, and later in other outlets, and began to read more deeply and develop my ideas further. Then along came Barack Obama. I wrote my previous book, Obama’s America, in which I examined his conception of American national identity, one that incorporates pluralism and inclusiveness into a strong, unifying vision of national community that, one would hope, Americans of every background could adopt. 

 

Robin Lindley: How did you come to write about Rush Limbaugh’s race-baiting rhetoric in the Obama Era? Did the project grow out of your past research for Obama’s America?

 

Professor Ian Reifowitz: To continue the story from above, one section of Obama’s America examined critics (mostly on the right, but a few to Obama’s left) of Obama’s vision of American national identity. I had spent some pages examining Rush Limbaugh’s rhetoric from the first couple of years of Obama’s presidency, and that had energized me (albeit with a sort of dark energy, compared to the more uplifting work of looking at Obama’s writings and speeches). 

 

Then, in the summer of 2015, the idea came to me for another book, and I thought: why not do a comprehensive, close examination of everything Limbaugh said about the Obama presidency? I put together a proposal, started the work in late 2015, and kept up the research until Obama left office, and then started writing. In the meantime, of course, Trump had emerged and been elected. Trump’s campaign and then victory helped me decide to focus the book on Limbaugh’s race-baiting, both in order to document it in a comprehensive way for people, and to draw parallels between what he was doing—playing and preying on white anxiety—and what Trump did in his campaign (and, to be sure, for years beforehand, starting with his own incendiary, racist rhetoric about the Central Park Five and right up to his claiming the mantle of birther-in-chief).

 

Robin Lindley: You address our current political “tribalization” by focusing on Limbaugh’s rhetoric. In your view, what is tribalization, and how does it affect our politics now?

 

Professor Ian Reifowitz: I’ll give you the definition I used in the book’s introduction rather than come up with something off the top of my head.

 

“Tribalization refers to a transformation much more profound than merely convincing Americans to be partisans who vote based on a shared set of policy preferences. It means cleaving America in two, and, in the case of Limbaugh, creating a conservative tribe animated somewhat by political ideology, but more so by racial and cultural resentment that feeds a hatred of the opposing tribe."

 

Robin Lindley: This new Limbaugh project had to be daunting and possibly distasteful to you in view of your past research on President Obama and your favorable view of his efforts to unite our diverse nation. How did you feel as you put your book together? 

 

Professor Ian Reifowitz: Well, I did mention above that I felt a different kind of passion motivating me on this project compared to the Obama book. But I have to admit that, once I got deep into the research, there were times when I wished I hadn’t committed to the project. There were plenty of times that I didn’t want to read through another word of Limbaugh. 

 

I guess my stubborn streak helped. I wasn’t going to abandon a project that I’d already invested so much time and energy in, and certainly wasn’t going to do so because Limbaugh’s rhetoric was hard to stomach. I hoped I was doing something important, that could make some connections that would help people better understand where our politics has gone in the past few years.

 

Robin Lindley: What was your research process for your new book?

 

Professor Ian Reifowitz: I went to RushLimbaugh.com and read through the transcripts, which he thoughtfully published free of charge, for every show he did during the eight years Barack Obama was president. To be honest, if the transcripts didn’t exist, I don’t know that I could have done the research by listening to the audio recordings. That might have been too much. Thankfully I didn’t have to find out. 

 

I also read secondary sources on contemporary politics, in particular on matters of race and identity. After I started focusing on the connections between Limbaugh and Trump, I read political science scholarship on public opinion in 2016, which documented how white anxiety and resentment correlated with votes for Trump both in the primary and general election, and I incorporated that information into my analysis.

 

Robin Lindley: What did you learn about Limbaugh’s origins?

 

Professor Ian Reifowitz: I read some about his rhetoric in his early years, and how he had used racist language even before making his turn toward talking full-time about politics in the 1980s. But the focus of the book is on what he said about Obama, which speaks for itself. To clarify, I don’t care whether he actually believes what he’s saying, because the effect his words have is the same whether he’s just a cynical opportunist or a true believer. I’m not especially interested in his motivations.

 

Robin Lindley: How did you come to focus on Limbaugh in your book? You see Limbaugh as a major force in dividing the US during the Obama era, but other potent Obama detractors included the current president, Senator Mitch McConnell and much of the Republican Party, Fox News, the Tea Party, and others. How would you weigh Limbaugh’s influence, if possible?

 

Professor Ian Reifowitz: My background, in terms of the kind of work I do, focuses on analyzing political rhetoric. Limbaugh was the person whose rhetoric I chose to examine because he broadcasts about two hundred shows a year, so there would be essentially no important issue relating to the Obama presidency that he would not address. Plus, he had the largest radio audience in the country throughout all eight years Obama was president (and decades before as well, and even in the years since up through the most recent month). 

 

I used him as a case study—where the biggest part stands in for the whole of the right-wing media. The transcripts helped as well, as it would be impossible to read every word broadcast on, say, Fox. This way, I had a closed, yet comprehensive, set of data to use as my source base.

 

Robin Lindley: Thanks for explaining your process. Do you see Limbaugh as an ally of white supremacist organizations such as the Ku Klux Klan and the American Nazis? 

 

Professor Ian Reifowitz: His show helps push sanitized versions of some of their ideas into the mainstream. The views he expresses are not the same as the views of the KKK or American Nazis, but he taps into some of the same hate and fear that they do. I don’t think that makes him an ally, but more like an enabler.

 

Robin Lindley: How did Limbaugh view Senator Obama before he was elected in 2008? 

 

Professor Ian Reifowitz: I didn’t look at the pre-inauguration rhetoric in a comprehensive way, but from what I saw nothing changed on Election Day.

 

Robin Lindley: How did Limbaugh usually describe President Obama? 

 

Professor Ian Reifowitz: You want the whole book in a nutshell? Here’s a brief summary from the book:

 

“While Obama was president, Limbaugh constantly, almost daily, talked about him using a technique that scholars call “racial priming”—in other words, he race-baited. The host aimed to convince his audience that Obama was some kind of anti-white, anti-American, radical, Marxist, black nationalist, and possibly a secret Muslim to boot. This was neither a bug nor a supporting element of Limbaugh’s presentation, but instead stood as a central feature deployed strategically in order to accomplish a very specific task, a task reflected in the title of this book. The tribalization of politics is exactly what Limbaugh set out to achieve.”

 

I’ll add: “[Limbaugh] portrayed him in a way designed to exacerbate white racial anxiety about a black president, or depicted him as a foreign “other,” outside the bounds of traditional Americanness.”

 

Robin Lindley: How did Limbaugh exploit Islamophobia and the fear of immigrants to attack President Obama?

 

Professor Ian Reifowitz:  He repeatedly sought to portray Obama as some kind of “secret Muslim” or somehow more sympathetic to Muslims—even terrorists—than to Christians and/or the interests of the United States. On immigrants, I’ll give you the following example:

 

“On July 1, 2015, two weeks after Trump’s infamous comments [made during his announcement that he was running for president] about Mexican immigrants being rapists and bringing drugs into the United States, a woman named Kathryn Steinle was shot and killed in San Francisco by Jose Inez Garcia Zarate [an undocumented immigrant with a criminal record].

 

“. . . On the campaign trail, Trump pounced, and Limbaugh followed suit a few days later. On July 7, in comments designed to inflame white racial resentment, the host claimed that Steinle’s name would “never be as well-known as Trayvon Martin,” and that the president would not deliver the eulogy at her funeral, even though Obama had not delivered a eulogy at the funeral of Martin or any other citizen killed by police. Obama did, however, speak at the memorial service for the five Dallas police officers murdered a year later….Limbaugh speculated that the president did not care about Steinle’s murder, and blamed it on the administration’s immigration policies, which were “coming home to roost”—this was a phrase uttered by Reverend Jeremiah Wright that was discussed so often on Limbaugh’s show. The host again talked about Obama hating America and wanting to alter its “composition” in order to change “the face of the country.”  

 

Limbaugh attacked the president over Steinle on three more shows over the next week. On July 15, 2015, the host contrasted Obama not having contacted the Steinle family with his having written letters to forty-six felons whose sentences he commuted, and with his outreach to the family of Michael Brown in Ferguson. Limbaugh’s point was to remind his listeners that Obama cared more about prisoners (read: black and Hispanic people) and black people killed by cops than about a white woman who was murdered by someone here illegally. If there’s one segment that both encapsulates Limbaugh’s tribalizing history of the Obama presidency and shows how his race-baiting rhetoric paved the way for the rise of Trump, this was it.

 

Robin Lindley: How did President Obama respond to Limbaugh’s attacks, particularly in terms of dealing with claims that he was pro-Muslim, anti-police, and anti-white? 

 

Professor Ian Reifowitz: He basically ignored them, but I did not examine Obama’s responses comprehensively.

 

Robin Lindley: Limbaugh attacked President Obama almost daily during his eight years in office. For Limbaugh, it seems that the ideals of equality, tolerance, democracy, community, and serving the common good are anathema and, indeed, anti-American. What is your sense of Limbaugh’s view of these ideals? 

 

Professor Ian Reifowitz: He would pay lip service to most of those ideals in the abstract, while attacking Obama and other liberals for seeking to change the traditional definition of them into something involving retribution and reparations that would take from whites and give to non-whites. He would turn any criticism of racial inequality in America back around and argue that the problem of racism in America stemmed from people exaggerating it. For example, on July 25, 2013, Limbaugh accused “the left” of wanting “race problems” to remain unsolved, and in fact wanting to make them worse. Why? Because “too many people make money off of racial strife, and therefore they’re always going to promote it.” Here’s a quote from May 26, 2010, about Obama and liberals in general: “everything’s about race. Everything is about skin color to these people, or however they classify people, however they seek to group them, whatever, they’re victims.” This is how he viewed racism in America. 

 

Robin Lindley: Beyond Limbaugh, what are some things that you learned about the massive right-wing media misinformation machine?

 

Professor Ian Reifowitz: I didn’t do too much with them, because they do generally move in lockstep. I did note in the book that in the summer of 2018, Tucker Carlson and Laura Ingraham on Fox News echoed Trump’s language of white anxiety regarding immigration and demographic changes. Limbaugh had spoken similarly as well during the Obama presidency, which I documented in greater depth.

 

Robin Lindley: Were there any particular findings that surprised you as you researched and wrote your book?

 

Professor Ian Reifowitz: Nothing really surprised me in terms of ideology; Rush pretty much delivered exactly what I expected when I started the research. I was already pretty familiar with his bile. However, when I came across Limbaugh’s comments connecting Tiger Woods and his sex-related scandals involving white women to Obama—comments suggesting that the president might be involved in something analogous, based on little more than the fact that both were multiracial men with a similar skin tone—that was something beyond even what I had expected. There were also a few times, at least until I got used to it, when I was surprised by how baldly Limbaugh simply lied about facts and statistics, in particular regarding the economy. Either he really didn’t understand them—which is not likely, because he didn’t manipulate them to make President Trump look bad, only President Obama—or he just thought lying was the right thing for him to do.

 

Robin Lindley: Limbaugh wasn’t new to exploiting race to divide Americans. In fact, that’s been a Republican strategy for decades. What did you find about how Republicans use race to their political advantage? 

 

Professor Ian Reifowitz: I wrote this in the book: “As journalist Dylan Matthews noted in an article entitled “Donald Trump Has Every Reason to Keep White People Thinking About Race,” a vast corpus of social science research indicates that “even very mild messages or cues that touch on race can alter political opinions,” and added that “priming white people to so much as think about race, even subconsciously, pushes them toward racially regressive views.”

 

Robin Lindley: What do you see as Limbaugh’s role in the election of President Donald Trump?

 

Professor Ian Reifowitz: I’ll share an example from the book, with some data, that demonstrates the role Limbaugh’s race-baiting rhetoric played in paving the way for Trump:

 

“Public opinion research data suggests that exactly this kind of rhetoric helped move some whites who had previously voted for Obama into Trump’s column by 2016—most Obama-Trump voters expressed high levels of anger toward non-whites and foreigners. It might be hard to imagine Obama voters being bigoted, but John Sides, Michael Tesler, and Lynn Vavreck found that significant numbers of whites who voted for Obama in 2012 expressed varying degrees of white racial resentment while also overwhelmingly embracing liberal positions on issues such as taxation and the existence of climate change. It might be surprising, but about 25% of those whites who found interracial couples unacceptable nonetheless voted for Obama in both 2008 and 2012. The country’s racial climate during Obama’s second term contributed to this phenomenon of racially resentful white Obama voters shifting to Trump, as [according to Zack Beauchamp at Vox] Black Lives Matter and Ferguson “kicked off a massive and racially polarizing national debate over police violence against African Americans.” Limbaugh took full advantage of that climate, and his race-baiting helped pave the way for Trump.”

 

Robin Lindley: What has Limbaugh been doing since Trump’s election? Does he continue to blame President Obama and Secretary Hillary Clinton for problems the nation faces? 

 

Professor Ian Reifowitz: I’ve stayed away from Limbaugh to some degree, just to give myself a break. But he’s still a huge media figure. He’s done exactly what I expected, which is the same thing he did once Trump became the presumptive nominee. He’s been a huge Trump backer and has continued to use rhetoric aimed at ginning up white anxiety, to make sure those anxious whites keep on remembering who their (false) champion is. I did check to see what Limbaugh said about Tiger Woods recently, now that Trump has embraced him, and in fact Limbaugh has done a 180, offering nothing but praise for how Tiger has been a friend to Trump. However, while discussing Tiger and Trump, Limbaugh made sure to remind his audience that Obama is still the one to blame for exacerbating racial tensions in America. He certainly doesn’t blame Trump—or himself, for that matter. None of those things qualify as a surprise.

 

Robin Lindley: What are some good ways to counter the hateful and inaccurate rhetoric of Limbaugh and his fellow extremists in the right-wing media? 

 

Professor Ian Reifowitz: I’ll leave folks with the concluding paragraphs of the book, which are as close as I get to offering a prescription going forward regarding how to counter the Limbaugh/Trump vision of America:

 

“White racial identity has been the foundation of the single most destructive form of identity politics over the course of American history. In colonial times, slave-owners raised the status of white indentured servants—many of whom had developed close relationships with the enslaved African Americans alongside whom they worked—transforming these “plain white folks” into equal citizens and telling them that they were superior to blacks, who were thus undeserving of freedom. Why did they do this? Because the slave-owning elites had one fear above all: a white-black coalition of the masses that would unite to overthrow them. Similarly, after emancipation, the Southern economic elites made sure to bind poor whites to them through the race-based advantages conferred by Jim Crow, all in the name of thwarting that same white-black, class-based political partnership. 

 

“In this century, some working- and even middle-class whites, especially those without a college degree, have been drowning economically in a way they have not since the Great Depression. For many, whatever privilege comes with being white is not enough to keep them afloat. They are angry, afraid, and looking for a scapegoat. Limbaugh has been only too happy to oblige. He has absolutely no interest in helping the country figure out how to deal in a productive way with the white anxiety that arises from demographic change. He is interested in one thing, and one thing only: exacerbating this phenomenon in order to keep separate whites and Americans of color who do share common economic interests. That is how Republicans win elections. 

 

“Limbaugh’s divisive approach, in that specific regard, is a carbon copy of the approach taken by the nineteenth-century Southern white elites. The more he can get working- and middle-class whites to identify with their racial identity—their tribe—above their economic interests, the better he will be able to prevent the multiracial, progressive coalition assembled by President Obama from growing strong enough to defeat Limbaugh/Trump-style conservatism once and for all. Ultimately, the right wing needs white racial anxiety. In fact, it cannot survive without it.”

 

Robin Lindley: Thanks for those powerful words. What are you working on now?

 

Professor Ian Reifowitz: Right now? How about a nap? I’m still teaching full time, so I’ll be spending time thinking about a new project over the coming months.

 

Robin Lindley: Thank you for your thoughtful comments, Professor Reifowitz, and congratulations on your fascinating new book, “The Tribalization of Politics.”

Charles Reich and The Greening of America – an Appreciation

 

Charles Reich, author of The Greening of America, passed away last month at the age of 91.

 

His book was first published in 1970 to mixed reviews: Newsweek’s Stewart Alsop called it “scary mush.” Another critic labeled it “a toasted marshmallow,” devoid of substance. Yet Reich’s book, a combination of history, sociology, and philosophy, struck a chord in a somber time of war and national protest. It went on to sell five million copies and become a key cultural anchoring point, a book that explained the new counterculture in a clear, uplifting manner.

 

In 1970, I was one of the legions of long-haired, dope-smoking, anti-war protesting college students. We knew what we were against, but were struggling to define a vision of the future. Like many of my friends, I devoured Reich’s book, underlining dozens of passages. The Greening of America became a touchstone for our generation, the center of many intense conversations in campus cafeterias and smoke-filled dorm rooms. We were angered by Nixon’s deceitful actions to prolong the Vietnam War, distrustful of a soul-crushing corporate culture and curious about the promise of new technology (NASA landed on the moon in 1969, but an affordable personal computer was still a decade away).

 

The Greening of America spoke to our concerns with a carefully reasoned, historically anchored thesis. It explained many of the hopes and fears we felt intuitively but had not been able to articulate at length. 

 

Rather than talking about a violent political revolution, Reich described a revolution based on a new, open culture that freed men’s minds rather than repressing them. He described an America that had evolved since the 1776 Revolution through three “consciousnesses.” In the first hundred years, Consciousness I, based on individual freedom and self-reliance, spurred the settling of the new nation. Consciousness II, born with the rise of industrial society in the 19th century, created a new hierarchy and demanded submission of an individual’s identity to the corporation. The rise of a mass consumer culture defined happiness in terms of a man’s position in a hierarchy of status. Factories and workshops produced a split between the duties and identity of the “man at home” versus the “man at work.”

 

According to Reich, the post-World War II boom brought new economic security and allowed the first stirrings of Consciousness III to emerge among the children of a new, expanded middle-class. Many members of this generation sought to gain a new freedom based on a “lifestyle” that was authentic. Their identity was based on cultural interests, creativity and self-expression, not status-seeking through the accumulation of consumer goods.

 

One of the joys of Reich’s book was its optimism; one reviewer noted, “It combined the rigor of an intellectual and the enthusiasm of a teenager.”

 

Greening was based on the belief that America had been founded with great hopes for personal freedom and that its Constitution allowed for major societal change. As Reich saw it, “there is a revolution underway. If it succeeds it will change the political structure as its final act. It will not require violence to succeed. Its ultimate creation could be a higher reason, a more human community and a new and liberated individual.” 

 

1950s Social Criticism

Reich’s book did not appear in a vacuum. Concern about the domination of large corporations in culture and politics and the loss of individual identity had been brewing for some time. David Riesman’s The Lonely Crowd, a critique of the new suburban culture, appeared in 1950. In 1964, Herbert Marcuse, a philosophy professor at UC San Diego, published One Dimensional Man: Studies in the Ideology of Advanced Industrial Society. Marcuse’s book contained scores of profound insights, but it was written at the level of a philosophy textbook. It contained numerous references to Marx, Hegel, Max Weber, Walter Benjamin and other German scholars. While we loved the title, One Dimensional Man was simply beyond the comprehension of most undergraduates.

 

For college students today, the word “greening” is closely associated with the environmental movement (green buildings, a Green New Deal), but Reich’s book barely touched on the environment. For him, greening meant newness, a natural growth. He compared the emerging youth culture to “flowers poking up through the concrete.”

 

Reich wrote in 1970, before the 1973 oil embargo, when gasoline was around 30 cents per gallon and solar power was so expensive it was used only on NASA space probes. The rainforests were still intact and global warming had yet to manifest itself.  

 

Greening did not meet with universal acclaim. Critics on the far left, including Herbert Marcuse, condemned it for being “naïve” and imagining that massive social change was possible without violent action. Marcuse, in a critique published in the New York Times in November 1970, warned that no national revolution has ever succeeded without violence. Marcuse advised that the entrenched “groups, classes, interests” in America controlled the police and armed forces. They set the priorities for America and they would not voluntarily give up any of their power.

 

Reading Reich’s book today, some fifty years after its publication, we can see that many of the descriptions of the repressive culture accompanying Consciousness II are still valid. But his predictions about an emerging Consciousness III were off-target.

 

From today’s vantage point, it is clear this book was written by an affluent white man working at an elite cultural institution and for an audience of well-educated young white people. Although Reich included a few quotes from Eldridge Cleaver’s recently published Soul on Ice, he never discussed the crushing poverty of inner-city ghettos, the suburbs’ segregated schools, or the structural racism still in place in the 1960s.

 

He also seemed blind to the nascent feminist movement. Betty Friedan’s The Feminine Mystique was published in 1963 and the National Organization for Women was founded in 1966, so the basic tenets of feminism were known, if not yet widely practiced. His vision for Consciousness III women was confined to “liberated housewives” and enlightened school teachers.

 

Still, Reich was eerily prescient about many other trends in American society. In Greening he posited that “the great question of these times is how to live in and with a technological society; what mind and what way of life can preserve man’s humanity against the domination of the forces he has created.”

 

He also warned of “the willful ignorance in American life.” He lamented that Americans “could be sold an ignorant and incapable leader because he looked like the embodiment of American virtues.” 

 

Reich left Yale Law School in 1974 and moved to San Francisco. He published several more books, including an autobiography, The Sorcerer of Bolinas Reef, in which he revealed his gay identity. 

 

Reich gave the younger generation hope in a dark period in American history.  He will be missed.

 

Note: Although the original 120,000-word edition of The Greening of America is out of print, a condensed, 25,000-word e-book version, with a new foreword by Charles Reich, was published in 2012 and is available on the Internet.

Who Says Historians Can’t Travel Back in Time?

 

Ricky Law is a historian of interwar Germany and Japan and an Associate Professor in the Department of History at Carnegie Mellon University. He received his PhD from the University of North Carolina at Chapel Hill in 2012.

 

What books are you reading now?

 

My book, Transnational Nazism: Ideology and Culture in German-Japanese Relations, 1919–1936, was just published. I still can’t help but read passages from it to come up with possible improvements. I have spent so much time in the past several years reading and revising the manuscript that I find it a little hard to kick the habit abruptly, even though there is no chance of making changes.

 

I am also beginning to prepare for my next book project on the cultural and social impact of foreign language learning in interwar and wartime Japan. This is an interdisciplinary project about history, international relations, and linguistics. I am now working my way through Robert Phillipson’s Linguistic Imperialism and Linguistic Imperialism Continued, which examine the concomitant rise of English and English-speaking countries to global predominance.

 

For personal interest, I am reading The Hellenistic World and the Coming of Rome by Erich Gruen and The Senate of Imperial Rome by Richard Talbert. As teachers, both authors were generous to indulge me in my amateurish fascination with ancient history. I am now starting to catch up on their books.

 

What is your favorite history book?

 

One favorite is hard to say. A book that left a deep impression on me was Weimar: Why Did German Democracy Fail? It is not a typical monograph but a discussion among four historians. The introduction by Ian Kershaw features some of the most illuminating writing I have read on the complex event. I encountered the book as a graduate student already familiar with the topic, but I was still struck by the clarity and insight of Kershaw’s overview of the history and historiography. The discussion following the introduction is an excellent source to learn how historians think and interact.

 

I read Christopher Browning’s Ordinary Men: Reserve Police Battalion 101 and the Final Solution in Poland three times – once as an undergraduate, once as a graduate student, and once as an instructor. Each time I learned something new, and I think that is the mark of a great work. Another book that I pick up from time to time is History of the Later Roman Empire, Volume 2 by J. B. Bury. I could hardly put the book down the first time I read it. Good history writing involves good storytelling.

 

Why did you choose history as your career?

 

I have always been interested in the past, but I didn’t know that I could turn it into a career. I began my undergraduate studies at UC Berkeley with a major in electrical engineering and computer science. It was the heyday of the original internet boom in the Bay Area, so a career in tech seemed self-evident. But very quickly I realized that engineering did not suit me and that if I worked in tech I would be miserable, so I switched majors to study what really motivated me.

 

The greatness of UC Berkeley as a university was that I walked out of one building and into another, but I still received the same world-class education. I ended up double majoring in history and German, and was one course short of a minor in classics. I also studied abroad in Germany in my junior year. After graduation, I took a chance offer to teach English in Japan and gained a deep appreciation for the country. I decided that I wanted to learn more about the history of Germany and Japan, so I went to UNC Chapel Hill for graduate school with a project on Japanese-German relations. I actually had no intention of entering academia until very late in my graduate studies. I was planning to become a civil servant – I went to public schools all my life and wanted to give back to society. I applied to both government and academic positions, and the university job offer came at the right time.

 

What qualities do you need to be a historian?

 

Attention to detail, persistence, open-mindedness, and a willingness to take risks. History is not a subject like mathematics or physics where one can make major discoveries through personal genius alone. The past is not something that one can just “figure out” – no amount of intellectual brilliance can replace time and effort spent with historical sources. I gained some important insights in my research by noticing scribbles in books, changes to letterheads, or choice of fonts. Published materials likely went through an editing process, so even small details can reveal the thoughts of their creators. To understand the past, it is important not to confine oneself just to political, cultural, or social history, etc. These are categories for analysis but people don’t live separate lives like that. Going beyond disciplinary boundaries can be risky but can also bring unexpected results.

 

Who was your favorite history teacher?

 

I was fortunate to have many excellent mentors throughout my studies. At UNC Chapel Hill, Christopher Browning and Miles Fletcher were superb co-advisers for my dissertation on transnational history. They taught me lessons on teaching and research that I still apply in my classroom and writing. At UC Berkeley, I took every class offered by Michael Grüttner, then a DAAD visiting professor from the Technical University of Berlin. In the twenty years we have known each other, he was always ready to answer my questions and provided indispensable insights for my book. I only took a freshman seminar taught by Tom Havens, but he has since been generous with his time and supportive of my work. Although I was just one of hundreds of students in Leon Litwack’s US history survey course, we had many interesting conversations during his office hours. When I began teaching my own large lecture courses, I looked back to that experience for inspiration and guidance. The senior-thesis seminar taught by Erich Gruen was singularly enlightening. When I write, I still keep in mind his exhortations to interpret evidence more thoroughly and skeptically. My history teacher at King City High School, Paul Cavanaugh, was not only a phenomenal teacher but also a great role model.

 

What is your most memorable or rewarding teaching experience?

 

I find it most rewarding when students tell me that they had dreaded history classes because of a bad experience in high school, but that my course changed their view of the subject and encouraged them to take more history classes or major in history. I teach a large, introductory survey course that is part of general education. Most of my students intend to specialize in a field other than history, so convincing any of them to take another history class or add a history major feels like a victory and a validation of my effort.

 

What are your hopes for history as a discipline?

 

History as a discipline is indispensable for democracy. It trains us to place ourselves in the shoes of others in different times and places. Thinking beyond the self and from the perspectives of others is the essence of democracy. My interactions with my students make me hopeful that history and democracy will thrive together.

 

Do you own any rare history or collectible books? Do you collect artifacts related to history?

 

I have a small library of German and Japanese books published in the interwar era. Most of these books were not accessible at the German National Library or the National Diet Library, so I had to look for them in bookstores or flea markets. Both Japan and Germany have a robust antiquarian trade. Going book hunting gave me a good reason to take a break from research or writing. Leipzig, where I stayed for many months, has historically been a center of publishing in Germany and host to one of Europe’s largest book fairs. But there is nothing quite like Jinbōchō in Tokyo, a district with streets lined with bookstores. I spent countless hours there browsing bookshelves looking for titles and discovering ones I did not know about. Many of the books I collected deal with the practicalities of intercultural interactions, such as travel guides, travelogues, and language textbooks. I am fascinated by the mechanics of how humans made sense of ideas, objects, and people from other cultures. I also have a small collection of maps from that period. Most of these maps were not for finding directions, but they gave viewers a chance to imagine traveling abroad.

 

What have you found most rewarding and most frustrating about your career?

 

My most rewarding experiences came from sharing my knowledge of history with others, be they colleagues, students, or anyone interested in learning more about the past. Seeing my book, which I have worked on for over a decade, in print is truly gratifying and makes me feel that I have added to human knowledge. But it is also frightening, as it is now in the open for everyone to see and comment on. The most frustrating times were when students wrote in course evaluations that they didn’t see the point of having to take a history class because they planned a career in STEM, the performing arts, etc. I wish they had told me earlier, so I could have tried personally to convince them that a grounding in history will make them better scientists, artists, and businesspeople. College is precisely the place to explore various areas of interest.

 

How has the study of history changed in the course of your career?

 

I may belong to the last generation of historians who remember the days before the internet. Information technology has undoubtedly transformed how we research history. Digitization allows the historian to handle an amount of material previously unimaginable. I could not have written a book on the mass media of two countries without digitized sources. Having said that, I think digitization causes some problems. One is information overload. Because it is so easy to access electronic material, it is very tempting to keep looking for the elusive “whole picture” rather than to start writing. Another is the loss of context. A keyword search can be the fastest, most efficient way to find a relevant document, but it rips the document from its surroundings, like reading newspaper clippings rather than scrolling through the pages. An even more disruptive change is that far fewer people are studying history. The interest is still there, but history is often erroneously perceived as a field that does not lead to career success. I worry that history may soon be studied only by the privileged few who can afford to. That would set history back to the status it had hundreds of years ago, which was certainly not the case when I began my studies.

 

What is your favorite history-related saying? Have you come up with your own?

 

I like to say that both astrophysicists and historians can travel back in time, but only historians can do anything about the past.

 

What are you doing next?

 

There are some post-publication activities associated with Transnational Nazism. I have read numerous books but until I wrote my own, I was not aware how much work goes into publishing one. I gained much respect for book authors. I have established a presence on Twitter (@rickywlaw), mostly to discuss my book and to comment on historically relevant current events. I have felt, and still feel, rather ambivalent about the purpose of social media and its repercussions. But that’s how the world works now (I’ll leave it to future historians to assess social media’s worth), and those who know better have a responsibility to speak up and speak out when history gets misused. I am also starting to write my next book. It will analyze why and how the Japanese learned various foreign languages in the interwar and wartime years. Additionally, I will develop a few new classes on Roman, Japanese, and German history.

 

Roundup Top 10!  

 

 

How the Declaration of Independence became a beacon to the world

by Charles Edel

The Declaration’s international reach.

 

Reflecting On The Civil Rights Act’s Anniversary With James Baldwin

by Lindsey R. Swindall

Much like the time in which Baldwin wrote, we are living through a period of deep political division and social crisis framed by global discord.

 

 

The Supreme Court Is in Danger of Again Becoming ‘the Grave of Liberty’

by Eric Foner

Supreme Court decisions have practical consequences, which justices too often blithely ignore.

 

 

An Open Letter to the Director of the US Holocaust Memorial Museum

by Omer Bartov, Doris Bergen, Andrea Orzoff, Timothy Snyder, Anika Walke, et al.

The United States Holocaust Memorial Museum released a statement on June 24 condemning the use of Holocaust analogies.

 

 

 

The Surprising History of Nationalist Internationalism

by David Motadel

Internationalism, a concept that, after all, implicitly presumes the existence of the nation, and extreme nationalism are not necessarily incompatible. The far right is less parochial than we think — and that’s dangerous.

 

 

 

The Lingering of Loss

by Jill Lepore

My best friend left her laptop to me in her will. Twenty years later, I turned it on and began my inquest.

 

 

The False Narratives of the Fall of Rome Mapped Onto America

by Sarah E. Bond

It is disturbing to see how gravely inaccurate 19th-century depictions of the destruction of Rome are used to illustrate news stories today, particularly those that draw parallels between Rome and the United States.

 

 

Why Democrats are wrong about Trump’s politicization of the Fourth of July

by Shira Lurie

He’s not doing anything that hasn’t been done for centuries.

 

 

How Lincoln's disdain for demagogues pricks Trump's Fourth of July pomposity

by Sidney Blumenthal

If nothing else, the president’s speech on the Mall on Thursday will show how far we have fallen since Lincoln.

 

A Declaration of Independence from Hunger

The Homestead at Hot Springs, Virginia, USA, where a UN Food Conference in 1943 laid the foundation for the UN Food and Agriculture Organization. (courtesy FAO)

 

As we celebrate the Fourth of July, let's remember the greatest line from our Declaration of Independence from Great Britain in 1776: "We hold these truths to be self-evident, that all men are created equal, that they are endowed by their Creator with certain unalienable Rights, that among these are Life, Liberty and the pursuit of Happiness."

These ideals of human rights for all citizens inspire us to end world hunger, for without basic food and nutrition no person is free to build a life and reach their God-given potential. These essential rights were evident in President Franklin Roosevelt's Four Freedoms speech early in World War II, the third of which was Freedom from Want. When FDR called for a United Nations Conference on Food during the war, it drew inspiration from the Declaration of Independence, according to Harvard Professor John Black in his book Food Enough. The delegates stated at Hot Springs, Virginia, in 1943: "This Conference, meeting in the midst of the greatest war ever waged, and in full confidence of victory, has considered world problems of food and agriculture and declares its belief that the goal of freedom from want of food, suitable and adequate for the health and strength of all peoples, can be achieved."

While progress has been achieved in fighting world hunger, still too many people go to bed hungry. We must continue to fight hunger at home and abroad. Americans who donated to the Letter Carriers National Stamp Out Hunger food drive in May can feel proud of their efforts to eliminate hunger. Stamp Out Hunger donations were the third-highest total ever, reaching over 75 million pounds of food. That is 63 million meals donated to America's foodbanks. Letter Carriers President Fredric Rolando says, "We are thrilled by the results of this year’s food drive and the impact it will have on helping feed those in need. And we appreciate the generosity and compassion of so many Americans." The most donations of any postal branch came from San Juan, Puerto Rico, which beat Los Angeles for the top spot. Orlando was third, and Florida was the state with the most donations.

When Americans donate to foodbanks, it also sends a message to our government to do its part. Congress and the President should support the TEFAP program that helps foodbanks, as well as the SNAP food stamp safety net for families. The Senate should pass legislation put forth by Senators Sherrod Brown and Kirsten Gillibrand to expand summer feeding. You never know when trauma will strike and a family will need help putting food on the table.

Charity in fighting hunger may start at home, but it does not end there. Hunger is a major international crisis right now, especially with the wars in Syria, Yemen, Afghanistan, and South Sudan. Relief agencies need support in the worst hunger emergencies of our time. Children suffer the most in the war zones, becoming victims of deadly malnutrition. The new Global Childhood report from Save the Children says, "Many children who manage to survive in these fragile and conflict-affected settings suffer from malnutrition. Recent estimates put the number of stunted children living in conflict affected countries between 68 million and 113 million (45 to 75 percent of the global total)." The child malnutrition crisis is going to get much worse this summer, with East Africa facing a major drought. Congress should increase funding for Food for Peace, McGovern-Dole global school lunches, and other aid programs.

As our experiences after World War II taught us, you cannot build peace where there is hunger and chaos. Food is of the utmost importance to our domestic and foreign policies. Each one of us can do something about it, especially when it comes to helping charities fight hunger. You can support your local foodbank and charities that fight hunger across the globe, like the World Food Program, Catholic Relief Services, Mercy Corps, Save the Children, Action Against Hunger, UNICEF, and so many others. You can write letters to your representatives in Congress urging them to do more to feed the hungry.

Everyone can be a leader in declaring a new independence: Freedom from hunger for all.

America, One and Inseparable

 

This Independence Day it is all too easy to find the theme we must adopt as we reflect on American history and identity. “America: One and Inseparable” comes to mind at once as we think on what we need to seek out and build for our greater good.

 

Those of us who have lived a long life in our great country look about at division everywhere among us.  In Congress, in many a state legislature, deep in our Society—the expression “united we stand,” once so normal, so commonplace, there for our use, seems no longer to describe us.

 

It is so very tempting to rush at the very outset to the figure of the one who has so successfully divided us and continues to do so.  He need not be mentioned, for all are all too well aware of what is happening and why this disaster has come upon us.

 

But today, I’d like to return to the document this holiday celebrates. The words enshrined in the Declaration of Independence tell us who this country is for and to whom our duty belongs: “all men,” it says, and we now consider that to mean “men and women alike.”

 

But despite those words and national mission, from birth to our passing we allow ourselves to be separated from our fellow beings!   The quiet acceptance of the First Amendment so common in earlier years seems open to debate, while that Second Amendment is in the public eye for absolutely devastating reasons, is it not so?  Our Leader seems determined to make of the word “Immigrant” something alien to us, as those with darker skin and sometimes a different religion are deemed quite unacceptable in the America we know and love. Violence is omnipresent—from the daily news to our entertainment.  It is a vehicle that is increasingly used to entertain, while the consequences create deeper and deeper fear.

 

Yes, the challenges we face as a nation are vast.  Guns and drugs and opioids have no place here, nor do threats that robots may take over tomorrow’s means of earning a living.  Often, it seems like compromise is impossible. As I offer these words, representatives of a political party in one state just moved out of state to avoid voting in the Legislature, denying legal representation and participation for all elected there.

 

To combat these challenges, we should listen to our former presidents, our great authors, and those who so brilliantly fill our pulpits.

 

We want to live with words like those of President Thomas Jefferson in his First Inaugural Address, President Abraham Lincoln as he faced those who suffered at Gettysburg, and even President Franklin D. Roosevelt, apolitical as he warned Americans against life governed by “fear.” 

 

We need to think on the Four Freedoms as characteristic of a free and self-governing Nation: the freedom of speech and expression, the freedom to worship God, the freedom from want, and freedom from fear. 

 

As we seek sources that may offer inspiration, we must not overlook the Archives of past Presidents of our Land.  Every time I enter the doors at West Branch, Iowa (Hoover), Abilene, Kansas (Eisenhower), Independence, Missouri (Truman), and Austin, Texas (Johnson), I get a real lift.  Two weeks at Hyde Park, N.Y., raised my spirits in 1952, and I’m sure the words of Kennedy (usually inspiring) and Nixon would inspire these days as they did long ago.

 

 

Billy Graham in his day and others who offer spiritual guidance are also capable of raising our ideas for improving Society.  Some columnists and press writers use prose that lifts our boats (an old expression).  There is no doubt in my mind that present-day dwelling on The Negative every morning with my newspaper and tuning in so often to TV News with its recital of “unpleasantness” is having an adverse effect that may prove lasting—that is, if not diluted by the uplifting.

 

Finally, in law-making it is becoming more essential than ever that we COMPROMISE on strong belief and frame new legislation that will serve us well.

 

Do the ideas above seem remote from the old, that traditional 4th of July spirit that served us well in the Past?  Well they might, for at the very outset I strongly suggested that our Nation is seriously on the wrong track and needs A New Spirit. 

 

What we must have, and soon, is a renewed affection for Unity, and respect for togetherness.  We need to breed hope across our landscape for a Unity of thinking about a safe and noble Future for us all.  We must reject the hatred and suspicion that will surely prevent our unity in spirit and action as we move forward, and find a determined and renewed affection for one another.  In that spirit, let us hope, the day intended to build patriotic spirit will serve us so we can face the Future united as a single democratic people.

Gerrymandering presented a ‘political question doctrine’ deemed outside Supreme Court jurisdiction

An example of gerrymandering

 

 

In Rucho vs. Common Cause, the Supreme Court held that the question of whether partisan gerrymandering in North Carolina and Maryland violated the Constitution was a political question over which federal courts lacked jurisdiction. The result was a long time in coming but was clearly correct. As Chief Justice John Roberts ably demonstrated, there were no neutral and objective judicially manageable standards by which a federal court could resolve questions of political gerrymandering. Beyond that, however, the decision may portend a much-needed revitalization of the political question doctrine, which the Court might apply to other legal questions as well.

 

First, here is a little background on the political question doctrine. Justice Elena Kagan’s dissent in Rucho assumes that if there is a constitutional violation, there must be a judicial remedy. However, if the political question doctrine has substance, the opposite is true. The political question doctrine holds that there are some legal questions that courts can’t resolve even if they are convinced that the legislative or executive branch – or a state institution – resolved them in a manner that was clearly incorrect. In other words, there is no instant replay. Just as there was a consensus that referees missed a blatant instance of pass interference in the 2019 NFC Championship game (affecting the outcome and denying the New Orleans Saints a trip to the Super Bowl), so it is with law. There are some legal, often constitutional, errors that are beyond the judicial capacity and authority to correct.

 

The political question doctrine has evolved in the federal courts since the very beginning of the republic. Chief Justice John Marshall may have recognized it in Marbury vs. Madison. It had most frequently been applied in the area of foreign affairs and war, but not exclusively. The great case that attempted to analyze and define the political question doctrine was Baker vs. Carr, which presented the question of whether equal protection-based challenges to the malapportionment of state legislatures raised non-justiciable political questions.

 

Writing for the Court, Justice William J. Brennan made a valiant attempt to deduce principles from the chaotic political question decisions that had accumulated over 150 years. He concluded the political question doctrine generally involved questions of separation of powers, but not always. He indicated six reasons why the Court had found political questions in the past. The only criterion pertinent to the reapportionment issue presented in Baker, as well as to the partisan gerrymandering issue raised in Rucho, was the absence of judicially manageable standards. In other words, the Court’s obligation, as stated in Marbury vs. Madison, was to resolve legal disputes through the application of pre-existing law. If there were no judicially manageable standards, hence no law to apply, federal courts had no business resolving the dispute regardless of how important it might seem.

 

Applying this principle to the reapportionment dispute, Justice Brennan concluded there were judicially manageable standards since the Court was accustomed to applying the Equal Protection Clause to a variety of issues. This was one of the greatest mistakes in constitutional history. In dissent, Justice Felix Frankfurter vainly argued that there was no constitutionally mandated or discoverable benchmark for proper apportionment. He argued that political question analysis should turn on the nature of the issue at stake rather than the legal theory underlying the challenge. Political thinkers had disagreed throughout history, including American history, as to the best way to apportion a legislature, and the Constitution itself provided no guidance. Thus there were no judicially manageable standards. The Court would simply have to choose among several contested alternatives, which it did two years later in Reynolds vs. Sims when it chose one person, one vote as the appropriate constitutional benchmark. Thus Baker vs. Carr effectively diminished the role of the political question doctrine. Instead, future courts acted on the assumption that if they could figure out a way to justify their decisions, no matter how unpersuasive or how lacking in constitutional pedigree, then by definition the case was justiciable and it presented no political question.

 

However, the one area following Baker in which there was serious judicial concern as to the lack of judicially manageable standards was challenges to partisan gerrymandering. Over a period of almost 50 years, the Court heard several challenges to partisan gerrymandering but never invalidated a legislative districting plan on that ground. In 1986, in Davis vs. Bandemer, Justice Sandra Day O’Connor, writing for three justices, argued that challenges to partisan gerrymandering constituted a nonjusticiable political question due to the lack of judicially manageable standards. Eighteen years later in Vieth vs. Jubelirer, Justice Antonin Scalia made the same argument, this time on behalf of four justices. However, Justice Anthony Kennedy held out hope that a judicially manageable standard might yet be discovered, though he conceded that the Court had failed to find one so far.

 

The search for such a standard seemed hopeless given that the Court had long made it clear that any standard that at least implicitly assumed or led to proportional representation between political parties was forbidden. The Court did not want to be faced with a flood of challenges to redistricting plans, recognizing that the losers of elections would have an incentive to file such lawsuits. Finally, the Court had long declared that some degree of partisanship in redistricting was constitutionally appropriate. Thus the question became one of degree: “how much is too much.”

 

After five decades of searching for a standard, Chief Justice Roberts was finally able to assemble a majority that concluded “enough is enough.” There is no judicially manageable standard here. This is a non-justiciable political question.

 

The political question doctrine was the constitutionally appropriate manner to dispose of the ever-increasing judicial challenges to redistricting based on partisanship. That, in itself, was a major achievement. But beyond the immediate case and issue, hopefully Rucho will lead to a revitalization of the political question doctrine after its diminishment in Baker vs. Carr nearly 60 years earlier. As Justice John Marshall Harlan declared in his dissent in Reynolds vs. Sims, “The Constitution is not a panacea for every blot on the public welfare.” The political question doctrine aids the Court in its appropriate constitutional role of resolving disputes through the application of pre-existing legal rules, as opposed to making up legal rules from whole cloth to resolve disputes. Hopefully, after Rucho, we will see more of the political question doctrine in the future. If so, it will help return the Court to its proper place in our constitutional system.

12 History Podcasts You Should Be Listening To

 

Stuff You Missed in History Class

The name is pretty self-explanatory, but Stuff You Missed in History Class definitely should not be (missed, that is). Each episode features the hosts, Holly and Tracy, telling the story of a new historical event that isn’t usually covered in standard history classes. They take turns explaining the event chronologically, with some humorous commentary on the side. If you feel that your history education is lacking, or just want to know more about niche topics of history, this podcast is number 19 on Spotify’s list of top educational podcasts. It’s easy to listen to and the 30 to 45-minute episodes go very quickly. 

 

Recommended first episode: The Bone War Pt. 1 (and 2, if you like it)

Revisionist History

This weekly podcast, which sits at 21 on the US iTunes podcast charts and is hosted by journalist and author Malcolm Gladwell, tackles topics from history that have either been overlooked or conventionally misunderstood. It typically runs from 30 to 40 minutes in length, and “asks whether we got it right the first time.” Gladwell adopts a pseudo-documentary style for each episode, featuring primary interviews and various recorded sounds that establish setting. He goes in-depth every episode and does the important job of debunking common misconceptions about the events of the past.

 

Recommended first episode: McDonald’s Broke My Heart

Ridiculous History 

Ridiculous History is iHeartRadio’s history podcast. Its two hosts, Ben Bowlin and Noel Brown, tackle new topics twice a week and “dive into some of the weirdest stories from across the span of human civilization.” The two hosts give an introduction and briefly explain the concepts they will be discussing at the beginning of each episode. Episodes sometimes feature guests, like podcasters Jack O’Brien and Miles Gray from the Daily Zeitgeist, who supplement the retelling. This podcast has some of the most unique subject matter of any of the podcasts on this list.

 

Recommended first episode: (Some of) History’s Dumbest Military Prototypes

The History of Rome

In this series, which ran from 2007 to 2012, host Mike Duncan takes listeners through the complete history of the Roman Empire. The episodes are more tightly scripted than some of the podcasts on this list and sound like chapters of a book being read aloud. It is a limited series, meaning one should listen to the episodes in order rather than skipping around. Although it is admittedly dry, this podcast is a great in-depth exploration of one of the more famous and formidable civilizations of human history. At only 15 to 30 minutes per episode, it is perfect for a morning or afternoon commute.

 

Recommended first episode: 001 – In the Beginning

The History Chicks

The History Chicks introduces listeners to various historical female figures as hosts Beckett Graham and Susan Vollenweider discuss the challenges the figures faced and the most interesting parts of their lives. Graham and Vollenweider give a little introduction of historical background to set up the figure they then talk about. Their side commentary interspersed throughout episodes keeps listeners entertained. This podcast, posted twice a month, is on the longer side, usually running between 60 and 90 minutes.

 

Recommended first episode: Mary, Queen of Scots

Our Fake History

This podcast tackles different historical myths and commonalities that are either not completely true or sometimes completely false. Host Sebastian dramatically reads out historical accounts from newspapers, public documents, and even historians; he then goes through challenges to those accounts from eyewitness testimony or other historians. The podcast is well-researched and gives a lot of information on interesting topics. 

 

Recommended first episode: Episode 38 – Was There a Real Atlantis? (Part 1 & 2)

BBC Witness History

Witness History is a short podcast produced by the BBC that describes itself as “history told by those who were there.” It covers various topics from modern history, from the war on drugs to women airplane pilots. The host is supplemented by primary audio recordings and interviews; as a result, it is more journalistic and has a news-report feel to it. There is a new episode covering a different topic every couple of days, and episodes run only 9 to 12 minutes, so if you’re looking for a podcast to listen to on your walk, this is it.

 

Recommended first episode: D-Day

The Dollop

This podcast is a personal favorite. Comedians Dave Anthony and Gareth Reynolds host. In each episode, Anthony takes on one subject of American history and reads the historical account, while Reynolds reacts to hearing it for the first time. It features more commentary than some other podcasts, but it makes the educational component fun. Some of the stories they cover are just so genuinely entertaining, they almost don’t even require any commentary. 

 

Recommended first episode: 210 – The New Jersey Shark Attacks

Atlanta Monster

If true crime series interest you, this is your podcast. Host Payne Lindsey adopts an investigative journalism style as he covers the notorious Atlanta Child Murders that took place between 1979 and 1981. The podcast uses audio from news clips and relies heavily on interviews, which highlight the first-person perspectives and experiences that make it so interesting to listen to. Atlanta Monster is a serialized true crime podcast, covering a single historical case over the course of a season.

 

Recommended first episode: S1 Ep1 – Boogeyman

BackStory

BackStory is the product of four historians at Virginia Humanities. Ed Ayers, Brian Balogh, Nathan Connolly, and Joanne Freeman take current events that people in the US are talking about and approach them from a historical perspective. They consistently choose interesting topics, like college sports, women in congress, and gambling. Episodes run from 30 to 70 minutes and the hosts do a good job of staying on topic. 

 

Recommended first episode: 276 – Red in the Stars and Stripes? A History of Socialism in America

 

In Our Time

In Our Time, produced by BBC Radio, covers older history that isn’t typically covered in the podcasts on this list, such as the Inca, Moby Dick, and the Epic of Gilgamesh. Each episode, host Melvyn Bragg gets right into the subject matter immediately and brings historical scholars on as guests to interview and offer explanations. The interviews keep things moving and offer expert analysis. This fast-paced, 40 to 60-minute podcast offers a different style as opposed to podcasts that focus on modern history. 

 

Recommended first episode: 1816, the Year Without a Summer

Past Present Podcast

Past Present tackles current political events from the perspective of professional historians. Hosted by historians Neil Young, Natalia Petrzela, and Nicole Hemmer, Editor of the Washington Post’s history section, Made by History, this podcast tries to make sense of what’s happening in the world by placing it in the context of history. It attempts to avoid partisan punditry and offers a nice alternative to current news cycles. Recent episodes cover various aspects of the 2020 election race, including Elizabeth Warren’s candidacy, Joe Biden and the 1994 Crime Bill, and tariffs.

 

Recommended first episode: Episode 184 – YouTube, Tariffs, and Elizabeth Warren

The 2020 Election and Presidential Age

 

Last week, 20 Democrats took to the debate stage over the course of two nights in hopes of becoming the 2020 Democratic nominee for president. On the second night in particular, the age gap between the candidates was striking. Joe Biden, 76, and Bernie Sanders, 77, shared the platform with Pete Buttigieg, 37. 

 

America has had 44 men serve in the Presidency. Theodore Roosevelt, 42 years old at his inauguration, was the youngest president and Donald Trump, 70 at inauguration, is the oldest. The average age of presidents at inauguration is slightly over 55. Eleven presidents were 60 or older, 24 were in their 50s, and nine were in their 40s at inauguration.

 

Of those who were 60 or older, several had health issues while in office. Two of the 11 died in office: William Henry Harrison and Zachary Taylor. Ronald Reagan displayed signs of aging, and many believed he was in the early stages of dementia or Alzheimer’s. Dwight D. Eisenhower suffered a massive heart attack while in office.

 

Several of the presidents who were elected in their 60s struggled to lead effectively. Two of these Presidents, John Adams and George H. W. Bush, could not win reelection, and Gerald Ford, who became president after Richard Nixon’s resignation, was unable to win a full term in the White House. Only three Presidents who served in their 60s and beyond—Harry Truman, Dwight D. Eisenhower, and Ronald Reagan—had what were regarded as outstanding administrations, making the top ten list of presidents in just about any scholarly poll.

 

Most of the remaining top 10 presidents were in their 50s when taking office (George Washington, Thomas Jefferson, Abraham Lincoln, Franklin D. Roosevelt, and Lyndon B. Johnson), with the exception of Theodore Roosevelt and John F. Kennedy, who were in their 40s. 

 

Several candidates would raise the average age of presidents based on their age on the day they would be inaugurated: Bernie Sanders (79), Joe Biden (78), Elizabeth Warren (71), Jay Inslee (68), John Hickenlooper (67), and Amy Klobuchar (60). At the same time, several potential nominees in their 50s would be consistent with the average age of presidents: (from oldest to youngest) Bill de Blasio, John Delaney, Michael Bennet, Kamala Harris, Kirsten Gillibrand, Steve Bullock, and Cory Booker. The potential presidents who would be in their 40s on Inauguration Day would lower the average presidential age: (from oldest to youngest) Beto O’Rourke, Tim Ryan, Julian Castro, Seth Moulton, Eric Swalwell, and Tulsi Gabbard. Finally, South Bend, Indiana, Mayor Pete Buttigieg, who would be only 39 years and one day old on Inauguration Day 2021, would be nearly four years younger than Theodore Roosevelt and four years and eight months younger than John F. Kennedy. Moulton, Swalwell, and Gabbard would also be younger than TR or JFK, but older than Buttigieg.

 

So the potential exists that we could have the oldest President in American history at inauguration with Sanders, Biden or Warren, or the youngest President in American history with Buttigieg, Gabbard, Swalwell or Moulton. If any of these seven take the oath, they will either raise the average age of American Presidents, or lower the age dramatically.

When the Future is the Past: A ‘Star Wars’ Summer at Tanglewood

 

This Sunday afternoon, the star fleet from the evil Empire, the storm troopers of Darth Vader and the nasty masters of the Death Star will once again battle the good guys – Han Solo, Luke Skywalker, Chewbacca, C-3PO, R2-D2 and Princess Leia. This time the epic conflict in the film Star Wars will be at, of all places, the renowned Tanglewood music center in Lenox, Massachusetts, as part of a three-concert film tribute to music composer John Williams.

 

The Tanglewood series resumes August 16 with the screening of Star Wars: A New Hope, with Keith Lockhart conducting the orchestra, and concludes August 24 with Williams back again for another concert of his different film music.

 

Star Wars represents the future in the past. It is about a galaxy in a far-off time in the future but the film itself is already 42 years old, an historic gem. It is entertainment history, Yoda and all, at its best.

 

Films accompanied by live music from famous orchestras are a craze across the nation, and Tanglewood was one of the first in the field, starting back in the 1990s.  Its films have even included Alfred Hitchcock’s Psycho.

 

Why Williams and Star Wars this summer? Could anything be more unclassical for this world-renowned home of classical music than Han Solo, Luke and Chewie zipping through the galaxy in the Millennium Falcon, blasting away at the bad guys?

 

“We are devoted to great music, and everyone, just everyone, puts the theme to Star Wars among the top pieces of music ever written,” said Dennis Alves, the director of artistic planning at Tanglewood.  “People hum the Star Wars theme everywhere. It is great music, plain and simple, and people just love it.”

 

Alves, who has to book every kind of act at Tanglewood, from the Venice Baroque Orchestra to James Taylor, has a formula for deciding which movies will be screened with his orchestra playing the music, at the Lenox music center in summer and at the Boston Pops’ home in Boston in winter.

 

“You want to pick a very popular movie, something that appeals to all or is remembered by all, because you want to draw as many people as possible and, frankly, they sell tickets. You want to pick a genre of film that is beloved, such as science fiction or adventure or comedy films. We did a Bugs Bunny tribute that was wildly successful. You want to select a film that will please teenagers as well as adults, too. And men and women,” he said.

 

To him, Star Wars is one of those films. “They are so popular that when each one comes out, I wait several weeks to see it to avoid the crush of the big crowds at the theaters,” said Alves.

 

John Williams is one of America’s greatest film music composers. He has won five Oscars and been nominated for 51 (second only to Walt Disney). Among his films are all of the Star Wars movies, Close Encounters of the Third Kind, the first three Harry Potter films, the first two Jurassic Park films, Home Alone, Superman, and E.T. He even composed the music for the first season of the television series Gilligan’s Island. He last appeared at Tanglewood in the summer of 2017. Back in the ’90s he began a long association with Tanglewood and has conducted there dozens of times.

 

“He’s a fan favorite,” said Alves, who added that the type of movies Williams scores are exactly the kind of films the music center wants to show.

 

Tanglewood executives claim that another reason they do film concerts is that the films draw a very different audience from their standard classical offerings. “Star Wars is the perfect example of that. We’ll get thousands of teenagers each summer for those concerts. We hope that these teenagers, who ordinarily would not come here, will come back later, or bring their own kids back to see the classical works.”

 

Another reason for the use of Star Wars is the movie’s cult following. “There are millions of Star Wars fans and there are a number of different Star Wars movies. One time when Williams was doing one of his concerts here, people in the audience yelled that they wanted more songs as an encore. What did he choose? The orchestra did music from The Empire Strikes Back. The place simply went crazy. They loved it,” said Alves.

 

The setting for the movie concerts at Tanglewood is beautiful. The music tent, or “shed,” and nearby Ozawa Hall sit on sprawling lawns carved out of a forest in the hills of Lenox, in western Massachusetts, three hours from both New York and Boston. In addition to the 5,000-some seats under the tent, hundreds of music lovers sit on the wide lawns beyond it. People often arrive in the morning and picnic on the lawns while listening to the Boston Pops practice in a quiet world far from the madding crowd.

 

Tanglewood is certainly not unique. For several years, the New York Philharmonic has added movie concerts to its schedule. Last spring, it presented a two-and-a-half-hour concert of Bugs Bunny cartoons with the orchestra playing the music, and David Geffen Hall at Lincoln Center was completely sold out. Best of all, everybody had the chance to have their picture taken with Bugs Bunny himself (he was not chewing a carrot, though). The audience roared at the cartoons and applauded madly at the end of the films, as audiences do across the country at these movie concerts.

 

So, on Sunday, and on August 16 and 24, Tanglewood will be packed with movie fans. Their enthusiasm will be with them. Their energy will be with them and, most importantly, the Force will be with them, too.

Adding a Citizenship Question to the Census Will Return It to Its Racist Origins

 

For most Americans, the census is something we rarely think about, for good reason. It comes around only once every ten years, and then it’s gone again. Besides a few stories on the changing demographics of the country, and possible changes in the number of representatives any given state is allotted, the census usually stays in the back of people’s minds.

 

That, however, changed with the arrival of the Trump administration. Seemingly out of nowhere, it was announced that the 2020 Census would be modified in a simultaneously subtle and monumental fashion. Wilbur Ross, the Secretary of Commerce – the department that oversees the Census Bureau – announced in 2018 that the next census would include a question asking for respondents’ citizenship status. Almost immediately the potential question was met with a bevy of criticism, support, and lawsuits.

 

Those lawsuits culminated this week in the Supreme Court handing down a decision in the citizenship census case. The Court essentially held that the government’s reasons for adding the question were inadequate at best, and lies at worst. However, the question is not yet settled. The Court explicitly left the door open for the government to come back with better reasons. It should be remembered that the Court has shown it is more than willing to look past the Trump administration’s stated reasons for its actions, and fabricate constitutionally legitimate reasons in order to uphold government actions. Trump has now stated he wants to delay the 2020 Census until the Court reconsiders (and submits to) this administration’s demand for a citizenship question.

 

What may seem to some a fairly innocuous question could actually have quite drastic consequences for what our country looks like for the next ten years. By adding the question, the administration hopes to scare non-citizens into not filling out the census. The logic is that non-citizens, whatever their immigration status, would be too intimidated to announce themselves to an administration which has made anti-immigrant ideology a central plank of its governance. States with heavy immigrant populations, which also coincidentally happen to be strongholds for the Democratic Party, would likely lose seats in Congress, funding, and a host of other things as their “official” populations shrank. The fear among many immigrants is certainly well founded, considering the President of the United States opened his campaign with violently racist remarks against immigrants, and his administration has made it a point to manufacture a humanitarian crisis on the southern border.

 

Many Americans might ask: What’s the big deal? The census is meant to count the number of American citizens there are in the country, right? Wrong. 

 

Nowhere in the Constitution does it say the census should count the number of citizens. Instead, the Constitution uses a much broader category: “persons.” That was no mistake. The inclusion of persons, or people, in the Constitution was the result of a deliberate concession to slaveholders made during the Constitutional Convention.

 

Almost from the outset of the Constitutional Convention, when delegates began debating how representation would be decided for what would become the House of Representatives, a major sectional divide emerged between slaveholding and non-slaveholding members. Simply put, those who enslaved black Americans thought their slaves should count towards representation, while those who did not thought the opposite. Non-slaveholding Northerners reasoned that by counting enslaved people the South would gain a disproportionate amount of power in the new national government. If enslaved people could not formally be a part of the society, then why should they count towards Southern representation?

 

Eventually, after many debates, much anger, and several threats, the delegates finally decided on a compromise: representation would be based on “adding to the whole Number of free Persons, including those bound to Service for a Term of Years, and excluding Indians not taxed, three-fifths of all other Persons.” The delegates chose this language quite purposefully. At a time when it was not expected that most people would be citizens, much less be allowed to be citizens, counting people was the best route towards compromise. It was the only way to count enslaved people, in any fashion, while holding onto some semblance of enslaved people’s debased status under the institution of chattel slavery. Imagine if the Constitution had said “three-fifths of all other ‘citizens’” would count towards representation when speaking of enslaved people.

 

The compromise paid dividends for Southerners. The Three-Fifths Clause allowed Southern slaveholders to possess far more political power in the national government than they should have had, compared to their largely free Northern counterparts. From the adoption of the Constitution until the beginning of the Civil War, Southerners would enjoy a padded population number under the census, at the expense of their enslaved population. It wouldn’t take long, either, for the investment to pay off. Thomas Jefferson, for example, would never have been elected president in 1800 had it not been for the Three-Fifths Clause, and the census which helped to actualize it.

 

Now, over 200 years after the adoption of the Constitution and over 150 years after Emancipation, conservatives wish to return the census to its racist origins. With a citizenship question, the census would once again be used to misappropriate political power in the country. Once deemed the best way to grab political power, the use of the word “people” in the Constitution now poses a threat to those who struggle to hold onto power. The Trump administration is attempting to do exactly what slaveholders did in the eighteenth century: use the Constitution to inflate the political power of a vocal minority.

What to the Refugee Child is the Fourth of July?

 

What to the refugee child is the Fourth of July? How will so-called Independence Day be celebrated by the children whose families are escaping violence in Central America, only to be placed in the concentration camps of the Ursula Detention Facility, the Port Isabel Detention Center, Fort Sill – where Japanese-Americans were once unjustly imprisoned – or the Tornillo tent city? For there is something obscene in the fireworks this year, something sacrilegious in singing the Star-Spangled Banner or God Bless America, something profane about the endless bromides concerning the wisdom of our founders. Something nauseating in the orchestral and pyrotechnic spectacle over the National Mall, presided over by a man elected by a minority of voters, who panders to his base with the abject inhumanity of family separation at the border, of denying migrant children – children – the necessities of basic hygiene.

 

Something heretical in valorizing a nation led by a man who crowds over a thousand people into concrete spaces designed for a hundred, where the cost of imprisoning a child is $750 a day but Justice Department attorney Sarah Fabian can claim before a judge that it’s “safe and sanitary” to deny people showers, soap, and toothpaste. How can you enjoy your burgers and hotdogs, baseball games, picnics with your family, when children are given water tainted with bleach and forced by ICE agents to defecate on themselves in concentration camps run by for-profit prison companies supported by your tax dollars? What to the refugee is the Fourth of July? Bitterness, ash, wind, and empty promises. What to the child imprisoned in an American concentration camp is Independence Day? Cruelty and lies.

 

So if you’ve avoided accounts from human rights lawyers about physically and sexually abused children, about parents and their babies drowning in the Rio Grande, about the spread of disease among a captive population who are only imprisoned because they dared apply for asylum after American foreign policy had made their own nations unlivable, then I implore you to look. Civil rights attorney Warren Binford recounted to Isaac Chotiner of The New Yorker what she witnessed at one Texas detention center, where there was a lice outbreak in a cell that held 25 children. Binford says that the children “were given a lice shampoo, and the other children were given two combs and told to share those two combs… and brush their hair with the same combs.” After the children – children – lost the lice combs, the “Border Patrol agents got so mad that they took away the children’s blankets and mats. They weren’t allowed to sleep on the beds and they had to sleep on the floor” as punishment. Dara Lind of Vox reports on the “dangerous overcrowding” of these camps, describing instances of “up to 900 people being held in a facility designed to hold 125.”

 

Physician Lucy Servier visited the Ursula detention center after a flu outbreak sent five infants to a neonatal unit; her first-hand report was quoted by Serena Marshall, Lana Zak, and Jennifer Metz of ABC News. Servier witnessed children kept in “extreme cold temperatures, lights on 24 hours a day, no adequate access to medical care, basic sanitation, water, or adequate food.” The imprisoned children are not allowed to wash their hands, which Servier said was “tantamount to intentionally causing the spread of disease.” Adolfo Flores of BuzzFeed News reports that “Breastfeeding mothers detained by US Border Patrol are only receiving half the amount of water they need, and hungry babies are not getting enough food.” Binford is also quoted by ABC News, the authors writing that conditions at the Clint facility “included infants and toddlers sleeping on concrete floors… guards punishing the children by taking away sleeping mats and blankets, and guards creating a ‘child boss’ to help keep the other kids in line.” Servier concludes by saying that the “conditions within which they are held could be compared to torture facilities.” What else are we to call this?

 

In this piece, I’ve borrowed my initial rhetoric from the justly celebrated July Fourth Oration of Frederick Douglass, delivered the day after the holiday in 1852 to the Rochester, New York Ladies’ Anti-Slavery Society. Historian David W. Blight writes in Frederick Douglass: Prophet of Freedom that Douglass “constantly probed the ironies of America’s contradictions over slavery and race,” and rarely is that more spectacularly the case than in his 1852 address. Douglass didn’t equivocate, and neither should we. If you’ve been following the news from the border, you’ve no doubt read the debates over the terminology to describe these crimes. In a tweet earlier this month, Democratic New York representative Alexandria Ocasio-Cortez maintained – accurately – that the United States government was “running concentration camps” at the border. While she found supporters for her position (including scholars on the subject), Ocasio-Cortez was also predictably lambasted by both Democrats and Republicans, with Liz Cheney, Republican Wyoming representative and daughter of the author of the Bush administration’s torture policy, sanctimoniously declaring that her colleague should “spend just a few minutes learning some actual history.”

 

So let’s do that. Concentration camps are not synonymous with Nazi extermination camps, and as a punitive method their use can be traced back to the British in the Boer War. The term is appropriate to describe the U.S. internment of Japanese-Americans during the Second World War, and any number of other instances as well. This might not be the “actual history” Cheney had in mind, though one imagines that once you’ve reached the level of defense which maintains that you’re at least not as bad as Adolf Hitler, perhaps policy has gotten out of hand. As Holocaust scholar Waitman Wade Beorn explained to Jack Holmes of Esquire, “Things can be concentration camps without being Dachau or Auschwitz. Concentration camps… at the most basic level… [exist] to separate one group of people from another group. Usually, because the majority group… deem the people they’re putting in to be dangerous or undesirable in some way.” In that same article, historian Andrea Pitzer, author of One Long Night: A Global History of Concentration Camps, is quoted as saying “We have what I would call a concentration camp system.” It is important to remember that the Nazi regime opened the gates of its first concentration camps in 1933; they weren’t converted to extermination camps until 1941. If “Never Again” is to have any meaning, it’s imperative to make sure that we never close that gap of eight years.

 

Masha Gessen, exiled Russian dissident and activist against Vladimir Putin’s regime, analyzes the implications of Holocaust metaphors in her excellent essay in The New Yorker titled “The Unimaginable Reality of American Concentration Camps.” Gessen explains that the faux-outrage about Ocasio-Cortez’s tweet can be contextualized by understanding that we currently face a “choice between thinking that whatever is happening in reality is, by definition, acceptable, and thinking that some actual events in our current reality are fundamentally incompatible with our concept of ourselves – not just as Americans but as human beings – and therefore unimaginable.” Just as Douglass did not equivocate, so too the critics of this evil system at our border must not. Douglass didn’t avoid offending “moderates” or “centrists.” In the eyes of the formerly enslaved man, the conduct of the nation was “hideous and revolting”; America was deserving of a “fiery stream of biting ridicule, blasting reproach, withering sarcasm, and stern rebuke,” a land of “great sin and shame.” Perhaps Cheney is chafed by the truth, but Douglass would remind her that what is needed is “the storm, the whirlwind, and the earthquake. The feeling of the nation must be quickened; the conscience of the nation must be roused; the propriety of the nation must be startled… and its crimes against God and man must be proclaimed and denounced.” Analyze the rhetoric, the grammar, the syntax, the diction of Douglass’s address, and ask yourself whether one iota should be changed in 2019.

 

A common question that teachers ask students when they learn about slavery or the Holocaust is “What would you have done if you were alive then?” While undoubtedly most of us answered that we’d resist or help, this summer we have more honest answers. If you ever wondered how you’d react, if you ever wondered how others would justify such malignancy – now you know. There is a grotesque segment of people fine with the fact that infants are forced to soil themselves and children are made to sleep on cold concrete. These people deny that it’s really happening, or pretend it’s not that bad, or most heinously they simply don’t care since it’s not their children, or they even glory in such atrocity because the victims have a different skin color, or a different religion. Then there are those brave women and men of conscience, people like the human rights attorneys and doctors who are advocating to abolish this wicked system, or the workers at the Wayfair company who went on strike rather than make furniture to supply the regime. And then there are the rest of us, those of us horrified, disturbed, and sickened, but too cowardly, tired, or paralyzed to do anything other than share articles like this one.

 

German-born philosopher Hannah Arendt writes in her 1978 book The Life of the Mind that the “sad truth of the matter is that most evil is done by people who never made up their minds to be or do either evil or good.” If we’re cowardly, we must steel ourselves; if we’re tired, we must wake ourselves; if we’re paralyzed, we must begin to move ourselves. The good news is that it is no single person’s responsibility to bear witness; no solitary woman or man will end this evil single-handedly – and there are things that you can do right now. You can take part in the July 12th “Lights for Liberty: A Vigil to End Human Detention Camps,” which is holding events at the actual sites and other locations around the country. More immediately, you can donate money, time, and resources to organizations on the ground working to provide legal consultation to the imprisoned and to help women and men make bail. There are several worthy organizations, but you might consider starting with the American Civil Liberties Union, which has a dedicated section of its website detailing ways to help on this specific issue; you can assist Lawyers for Good Government; or you can contribute to RAICES, a refugee aid project founded by community activists in south Texas. Most of all, you must not look away. Douglass demanded of his audience that they look, and we must do the same. Once we do, what do we see?

Independence Day and the Importance of Public Service in the Face of Regnant Tyranny

 

Independence Day, the 4th of July, is a day for, shall we say, celebratory reflection. It is a day not for chest-thumping martial ardor and jingoism, but for honoring and giving renewed life to the ideas that made the Declaration of Independence what it is: an “expression of the American mind,” in Thomas Jefferson’s words, as well as the philosophical foundation for our idealized way of life. It also serves as a reminder that America’s founders, whose wisdom and courage we extol, were dissidents – revolutionaries – who acted treasonously to rid themselves of a legally constituted government that nonetheless denied its subjects the fundamental rights all humans deserve to enjoy. It is both the right and the duty of those who suffer such despotism, our founders declared, to dissent against – even to overthrow – the government in power.

 

The man some consider the “Father of the American Revolution,” Thomas Paine, spoke forcefully and famously to this conjunction of tyranny and dissent in December 1776: “These are the times that try men’s souls. The summer soldier and the sunshine patriot will, in this crisis, shrink from the service of their country; but he that stands by it now, deserves the love and thanks of man and woman. Tyranny, like hell, is not easily conquered.” In this, the first of his The American Crisis pamphlets, Paine sought to inspire his fellow Americans still at war with their British oppressors.

 

Today, we are at war with ourselves, riven with division, polarization, hatred, and alienation as never before in recent memory, our most cherished traditions under attack, precipitated and accentuated by massive failures of leadership at the highest levels of our government – a representative democracy, conceived as superior to both autocracy and direct democracy, and predicated on the (flawed) premise that the best of us govern the rest of us.

 

For those in government – elected officials, political appointees, and government workers alike, all public servants charged with fulfilling the imperative of government of the people, by the people, for the people – these are times calling for the highest order of courage, integrity, and honor. As public servants, we have all, in accordance with the law (5 USC 3331), sworn an oath of allegiance to a Constitution that owes its intellectual antecedents to the Declaration. It is to this Oath that we must return today for inspiration, guidance, and the imperative for action.

 

Here, lest we forget, are the principal words of that Oath:

 

“I _____ do solemnly swear (or affirm) that I will support and defend the Constitution of the United States against all enemies foreign and domestic [and] that I will bear true faith and allegiance to the same. . . .” 

 

Some of us may remember comedian Red Skelton’s 1969 ode to the Pledge of Allegiance on CBS television. Skelton told the TV audience how the principal of his school in Vincennes, Indiana, Mr. Lasswell, had explained the meaning of the Pledge to students he thought had become complacent and bored. 

 

Skelton’s soliloquy – in which he clarified or interpreted each and every word of the Pledge – was extraordinarily powerful, not simply because of the content of his words but equally because of his delivery: a nationally recognized slapstick comedian suddenly gone deadly serious in response to the turbulence and upheaval of the times. By way of abbreviated example, here’s how he started:

 

I – me, an individual, a committee of one.

Pledge – dedicate all of my worldly goods to give without self-pity.

Allegiance – my love and my devotion.

To the Flag – our standard, Old Glory, a symbol of freedom. Wherever she waves, there is respect because your loyalty has given her a dignity that shouts freedom is everybody's job. . . .

 

I lived through the turbulence and upheaval of that time, accentuated and capped off by Watergate. Not since then, until now, have I had to endure a test of the American way of life and our form of government to rival the Watergate experience. Considering, though, that such a crisis – a Constitutional crisis, a crisis of democratic governance – is now upon us, I consider it perfectly justifiable to appropriate Skelton’s rhetorical method as a vehicle for clarifying the meaning of the Oath and reminding those who have taken it of their duty. So, here’s my presumptuous attempt at exegesis:

 

I – me, a sovereign individual, a sentient human being with my own mind, capable of transcending social conditioning and self-blinding ideological indoctrination to exercise free will, unfettered and unassisted, fully responsible and accountable for my own thoughts, words, and deeds.

 

do solemnly – with utmost seriousness and commitment, devoid of frivolous or cavalier intent or demeanor.

 

swear (or affirm) – officially express myself to all who would hear, my word as my bond, my civic sacrament.

 

that I will – me, alone, by myself, commit henceforth to taking constructive, purposeful action informed by thoughtful understanding, not mindless dogma.

 

support and defend – unconditionally, unreservedly stand behind and stand for, reaffirm, uphold, and protect.

 

the Constitution of the United States – the supreme law of the land, America’s civic gospel, to include the organizational arrangements, authorities, processes, principles, values, rights, and responsibilities embodied in that seminal expression of what this country – our more perfect union – should stand for.

 

against all enemies – to counter any party, tyrannical, despotic, or antagonistic, whose actions actually threaten, undermine, or usurp the foregoing; sensitive all the while to the possibility that sometimes – oftentimes even – the enemy we meet is us.

 

foreign and domestic – whether inside or outside the country, inside or outside government, inside or outside our chosen community of like-minded associates.

 

[and] that I will bear – again, I alone, undertake to shoulder personal responsibility and be accountable for so doing, regardless of consequence.

 

true faith and allegiance – not false, manufactured, contrived, or self-serving, but bona fide, critically understood and embraced, unconditional belief and loyalty.

 

to the same – the Constitution, with all the values, precepts, authorities, and imperatives expressed – and implied – therein.

 

To anyone who has ever sought to soar like an eagle when surrounded by turkeys; who has given in to or been forced to live by the mushroom principle by being kept in the dark and fed the social or political equivalent of manure; who has learned the hard way (or even by accident) how hard, if not impossible, it is to turn chicken droppings into chicken salad, it isn’t hard to recognize something of the challenge before all public servants – certainly those who accept the proposition that we have enduring civic obligations, grounded in the Oath we have taken, that demand to be met – now, perhaps as never before. 

 

Those who have sworn this oath of allegiance to the Constitution would be well advised, in fact, to take heart – in a double twist of irony, I might add – in one of the most famous statements commonly and erroneously attributed to super-patriot, anti-federalist, Constitutional opponent Patrick Henry: “The Constitution is not an instrument for the government to restrain the people. It is an instrument for the people to restrain the government – lest it come to dominate our lives and interests.” What more compelling words could there possibly be to remind us of our obligations to office, to The People, and to ourselves? 

Colin Powell’s Recent “Lincoln Medal” Disregards A Checkered Past

 

Last week, Colin Powell was awarded yet another medal, the Lincoln Medal, for his storied public service. Among his myriad previous awards are two Presidential Medals of Freedom, a Congressional Gold Medal, the Liberty Medal, the NAACP’s Spingarn Medal, the Woodrow Wilson Award for Public Service, and the Ronald Reagan Freedom Award.

 

The Lincoln Medal from the Ford’s Theatre Society is awarded to people whose accomplishments and personal attributes exemplify “the lasting legacy and mettle” of Abraham Lincoln. At the recent award gala, President Donald Trump—whom Powell has called “a national disgrace”—proclaimed that the retired Army general was an American “to really look up to” and “a man I have a lot of respect for.”

 

Powell, the former secretary of state and chairman of the Joint Chiefs of Staff, certainly deserves plaudits for his four decades of government service. As Powell’s most recent biographer, I have argued that when he became George H. W. Bush’s principal military advisor, he was indeed an exemplary public servant and the consummate subordinate—dedicated, competent, thoughtful, honorable, and independent. Powell not only supported President Bush well, but he also challenged him when he believed that the commander-in-chief was heading towards a bad decision.

 

The Persian Gulf War provides three examples of Powell’s superior service. 

 

First, when Powell thought Bush was moving too quickly towards war, he spoke out, warning the president that war with Iraq, which then occupied Kuwait, “would be the NFL” and “not a scrimmage” akin to the recent victorious U.S. war in Panama. Instead, Powell advocated a “strangulation” policy of sanctions and containment. In brief, the famously “reluctant general” provided the president with a well-considered option short of warfare. And, when the president decided that war was necessary, Powell rightfully saluted his superior and prepared for the coming conflict.

 

Another example of the chairman’s excellent followership came soon thereafter when Bush expressed his desire to prosecute the war primarily through air power. Powell again cautioned the president, providing alternative advice. The general asserted that a ground war was necessary to guarantee victory, and that he needed 500,000 troops to accomplish the mission. This time around, Bush embraced his subordinate’s expert counsel.      

 

After a relentless aerial bombing campaign and only three days of ground warfare, the Iraqi army began fleeing Kuwait. Having achieved the principal objective of liberating Kuwait, Powell worried about losing “the high moral ground,” and he warned Bush, “We don’t want to be seen as killing for the sake of killing.” The president again accepted Powell’s advice and suspended the offensive campaign. This decision, writes military historian Rick Atkinson, “was a rare triumph for the better angels of our nature.”

 

Powell’s outstanding service as chairman of the Joint Chiefs of Staff notwithstanding, the perpetual lionizing of the general has had the effect of unbalancing the historical record. Despite—or because of—his popularity and patriotic achievements, Powell has not been held to account for some of his major failings as a public servant. The Iran-Contra scandal and the treatment of Afghanistan War detainees are but two examples.

 

Like that of others in the Reagan administration, Powell’s involvement in the Iran-Contra scandal was less than honorable.

 

There is no doubt that Powell, then an Army major general and a most loyal assistant to Defense Secretary Caspar Weinberger, was aware of Reagan’s illicit program of selling arms to Iran, a U.S.-designated terrorist state, in hopes of recovering American hostages in Lebanon. 

 

It is also clear that Reagan, his senior staff, and Powell understood that the Iran program, which at first included selling the weapons through Israel, was illegal. At one point, Reagan informed his advisers that he would risk going to prison because the American people would want him to break the law if it meant saving the lives of hostages. “They can impeach me if they want,” Reagan quipped, “visiting days are Wednesday.”

 

 

Powell understood that these transfers of American-made missiles mandated Congressional notification. As a senior government official who swore an oath to support and defend the Constitution, Powell possessed a duty to report the Iranian transactions to army leadership, congressional leadership, or the Justice Department. He did not.

 

To make matters worse, Powell wittingly participated in the subsequent cover-up of the Iranian operation, and in the process, he obstructed justice by deceiving and misleading “out-of-control” federal investigators to protect himself and his superiors. In 1992, a grand jury indicted Caspar Weinberger on five felony charges for lying to Congress and obstructing federal investigations. Just weeks before his trial, however, he received a presidential pardon, one which Powell had lobbied for.

 

A decade later, Powell, by then secretary of state for George W. Bush, faced another major ethical dilemma: how to treat detainees captured during the Afghanistan War.

 

It is disheartening that Powell, a combat veteran with a vested interest in the Geneva Conventions, did not champion minimum standards of humane treatment for all war captives. While the secretary did favor giving prisoners “due process” to determine if they warranted POW protections, he never seriously supported giving the detainees ironclad Geneva protections from physical and psychological abuse.

 

Powell, furthermore, did not object to the CIA’s clandestine program of rendition and torture, which Senator John McCain condemned as “one of the darkest chapters in American history.” Again, to make matters worse, Powell gave the false impression in public that he opposed the mistreatment of all captives, including so-called unlawful enemy combatants. For years into retirement, Powell perpetuated the myth of his opposition to detainee abuse, falsely telling MSNBC’s Rachel Maddow, “We had no meetings on torture” and “It was always the case, at least from the State Department’s standpoint, we should be consistent with the requirements of the Geneva Convention.”

 

Colin Powell remains one of the most admired and respected people in the United States, and there are just reasons for both his enduring popularity and the ongoing feting. But, as with America’s most revered presidents, Washington and Lincoln, Powell proved a fallible patriot, who, in the course of a long and distinguished public career, made some grave and consequential errors in judgment. While those blunders do not erase the significance of his achievements and service, they are failures nonetheless, and they should not go untold. 

 

 

What We Can Learn About Tough Times and Problem-Solving from the 1970s

Lava lamps, popularized in the 1970s

 

Yes, my friends, times are hard. We’re neck deep in troubles abroad and at home, from major-league agita with Iran, Russia, China and North Korea to pop-up trade wars with U.S. allies and seemingly insoluble concerns like immigration, opioids and guns. Our poor, polarized national psyche struggles to keep up.

 

Maybe this is a good time to pause and remind ourselves that “hard” is a relative term. Take the 1970s, for example.

 

Short of world war, it’s hard to top the disco decade for wrenching events: America’s agonizing retreat from Vietnam, Watergate’s takedown of Richard Nixon, the Iran hostage crisis, nuclear meltdown at Three Mile Island, and energy crises that sent the economy skidding into stagflation, as prices shot the moon and growth locked itself in the cellar. Such reversals of fortune left the country disoriented and depressed.

 

Not that we’ve got it a whole lot easier today; in some ways, our problems may be even less tractable, given rambunctious leadership in the White House and the country’s at-your-throat political divide. So, it’s reasonable to assume an iconic voice from the seventies, iconoclastic economist and thinker E.F. Schumacher, might have some advice for us.

 

Schumacher, a British citizen of German birth, achieved cult-like status, especially among the young and high-minded, after the 1973 publication of his international bestseller, “Small Is Beautiful: A Study of Economics as if People Mattered.” A manifesto for economics with a heart, it called for moving from profit-driven to people-oriented policies; from environmentally degrading industry to industry that intelligently husbands resources; from outsized economies of scale to scaled-down models employing technology “appropriate” to local communities.

 

Little wonder Schumacher’s recipes for humane overhaul struggled for traction in a corporate world bent on proliferating bigness. Yet his ideas (Schumacher died in 1977) have continued to exert an antiestablishment appeal, their echoes resonating today in the policy positions of progressives like Bernie Sanders, Elizabeth Warren and Alexandria Ocasio-Cortez, and in some libertarian circles as well.

 

But it’s Schumacher’s 1977 book, “A Guide for the Perplexed,” that zeroes in on the role problems play in human life, how and where knowledge comes into play, and how to enlist the better angels of our nature in making headway against our thorniest challenges.

  

In Schumacher’s diagnosis, problems come in two flavors. The first type is preponderantly technical in nature—how to design that flying car, for example, or perfect a life-saving cancer vaccine. Solutions involve hard, smart work: developing smaller, more efficient energy sources, maybe, or fine-tuning gene-splicing techniques. Interested parties work the puzzle together or apart, Schumacher says, until answers “converge” and “a design emerges which is ‘the answer.’”

 

The second category presents the biggest headaches. That’s because these problems contain elements that defy purely rational answers, freighted as they are with basic value judgements, contrasting moral perspectives or religious beliefs. The result is the sort of strong conflicting emotions that can stop progress in its tracks. Think abortion, climate change, and gun control.

 

The problem with these problems, Schumacher points out, is they “do not converge.” In fact, “the more they are clarified and logically developed, the more they diverge, until” the lines harden and opposing tribes go to war, by whatever means. “Logic does not help us,” the author writes, “because it insists that if a thing is true its opposite cannot be true at the same time.” A divergent problem “does not yield to ordinary, ‘straight-line’ logic; it demonstrates that life is bigger than logic.”

 

There is no “correct formula” for divergent problems, Schumacher writes; they have to be “transcended.” 

 

“A pair of opposites—like freedom and order—are opposites at the level of ordinary life,” the theory runs, “but they cease to be opposites at the higher level, the really human level, where self-awareness plays its proper role. It is then that such higher forces as love and compassion, understanding and empathy, become available, not simply as occasional impulses … but as a regular and reliable resource.” At that point, Schumacher says, “Opposites cease to be opposites.” 

 

How quaint that sounds today! In a world of weaponized tweets, jaded with digitally driven entertainments, we’re more likely to check our social-media feeds or indulge our Netflix fix than resort to over-the-horizon thinking. Very much in the zeitgeist of the 1970s, Schumacher argued that tackling divergent problems was not merely a means of solving humanity’s pressing puzzles; it marked a path to self-development, open-mindedness and inclusion.

 

“Divergent problems,” Schumacher writes in “Guide,” “offend the logical mind, which wishes to remove tension by coming down on one side or the other, but they provoke, stimulate, and sharpen the higher human faculties, without which man is nothing but a clever animal.”

 

Opposing tensions helped Schumacher define his eclectic career. In the 1930s, he chose England and a Rhodes Scholarship over living in Hitler’s Germany. At Oxford, he studied under John Maynard Keynes, and, in 1950, became chief advisor to the British National Coal Board. Ever the polymath, Schumacher augmented his training in orthodox economics by cultivating interests in Buddhism, Indian philosophy and Catholic thought, elements on which he draws in “Guide.”

 

For all that, Schumacher was a theorist, not a soothsayer. He couldn’t have foreseen exactly where our fractured politics, information tsunami, and adversarial approach to life, in general, would land us. Yet he did anticipate the principle-lite fickleness at the heart of today’s post-truth society: The “logical mind …” he writes, “wishes to give its exclusive allegiance to” one diametrically opposed position or the other, “and since this exclusiveness inevitably leads to an ever more obvious loss of realism and truth, the mind suddenly changes sides, often without even noticing it.”

 

In Schumacher’s eyes, educational priorities were equally skewed. Rather than aiming to cultivate the individual, modern school systems were designed to graduate products to serve the cogs-in-a-wheel demands of commerce and industry. The result, Schumacher argues in “Guide,” is a set of structures that can gin up artificial demand for problem-solving while more existentially important puzzles go begging. “Our anxiety to solve problems,” Schumacher says in “Guide,” “stems from our lack of self-knowledge, which has created [a] kind of intellectual anguish … [that] has led to a virtually total concentration of intellectual effort on the study of convergent problems.”

 

Says Schumacher: “Great pride is taken in … ‘the art of the soluble’”—or perhaps today, we’d be tempted to say, in the art of the deal.

 

Wrestling with “the real problems of life,” as Schumacher calls them, is what deserves our highest-order attention. Human life “… can thus be … understood as a succession of divergent problems … They are refractory to mere logic and discursive reason, and constitute, so to speak, a strain-and-stretch apparatus to develop the Whole Man, and that means to develop man’s supralogical faculties. All traditional cultures have seen life as a school and have recognized, in one way or another, the essentiality of this teaching force.”

 

Nothing better underscores Schumacher’s approach to education writ large than his obvious fondness for this quote from Thomas Aquinas: “The slenderest knowledge that may be obtained of the highest things is more desirable than the most certain knowledge obtained of lesser things.” Humility and patience in the pursuit of “the highest things” are keys to unlocking divergent problems.

 

And so, the question: How do we actually tap our “supralogical” faculties? Schumacher’s answers can, at times, come off as gauzy, yet ever upward points the way. He recommends reconnecting with the world’s great philosophical and religious traditions, and looking for lessons in its great art and literature. (Ever the economist, he knows a trove of underutilized resources when he sees it.) What really matters, Schumacher writes, “is whether a person rises to his highest potentialities or falls away from them.” 

 

Critics were quick to spot divergence in Schumacher’s theories. “I kept feeling that something urgent was being said about how the reductionist logic of modern science has indeed misled us and is useless when it comes to the most perplexing questions we face,” Harvard theologian Harvey Cox wrote in The New York Times when “Guide” came out in 1977, “… but the firepower [Schumacher] has concentrated [on his argument] is so mixed and so massive that his original point frequently gets lost.”

 

It is hard to see how reading “The Bhagavad Gita” or practicing Zen meditation might pull us out of our current national funk—or help us create the scalable meeting of minds required, say, to bring Second Amendment diehards together with staunch gun-control activists to thwart our epidemic of public shootings.

 

Yes, folks, times are hard. Too many citizens have given up on pursuit of higher truths in a time when “alternative facts” can make the real thing seem fuzzy, technology is pulverizing privacy, and people feel increasingly powerless in the face of big externalities. Who knows, the growing enthusiasm among Millennials and Gen Z-ers (estimated to make up 37 percent of voters in 2020) for people-oriented policies may once again thrust E.F. Schumacher and his theories into the limelight.

 

In the meantime, if a sage of the seventies has anything to teach us today, it may lie in his chronically appealing idea that optimism about the human spirit, and faith in our potential for problem-solving, can flourish even in the toughest of times.

 

 


 

The Overlooked Story of “the Greater United States”: Historian Daniel Immerwahr Shares His Unique Perspective on American Empire

 

The history of the United States is the history of empire.

Daniel Immerwahr, How to Hide an Empire

 

Although most Americans are familiar with the “logo map” of 48 contiguous states, the story of empire and America’s possessions beyond the North American continent has been largely overlooked. 

 

In his groundbreaking new book, How to Hide an Empire: The History of the Greater United States (Farrar, Straus and Giroux), Professor Daniel Immerwahr offers a corrective that fills in the history of American empire from the perspective of those who live in US territories and other possessions. His lively book is based on extensive archival research, interviews, and other previously overlooked resources.

 

As Professor Immerwahr writes, the founders were ambivalent about rapid expansion, but within a few decades of the birth of the nation, the inexorable and often brutal movement west had captured a continent from the Atlantic Ocean to the Pacific, a place for 48 contiguous states. Then the country looked beyond the mainland, and How to Hide an Empire offers a fresh point of view that casts our national past in a new light.

 

As Professor Immerwahr vividly recounts, the offshore history begins when the U.S. took possession of dozens of uninhabited islands in the mid-nineteenth century for coveted guano—bird dung—to fill a need for natural fertilizer on the mainland. By the turn of the century, the U.S. expanded farther with the possession of the Philippines, Puerto Rico, Guam, the U.S. Virgin Islands, American Samoa, Hawai‘i, and more. Today, many former American colonies have been replaced by a “pointillist empire” of more than 800 American military bases and facilities around the globe.

 

From the Guano Islands to America’s widespread empire now, Professor Immerwahr shares neglected stories from the past showing how inhabitants of America’s possessions have been relegated "to the shadows," and, at various times, have been "shot, shelled, starved, interned, dispossessed, tortured, and experimented on."

 

The example of the Philippines is particularly grim, as Professor Immerwahr reminds us. In the bloody yet mostly forgotten Philippine-American War (1899-1902), US troops interned civilians in prison camps, tortured prisoners, and killed thousands of Filipinos—mostly civilians. After “winning” this rich archipelago, the U.S. governed the Philippines as a colony. Fast forward to the Second World War and, all told, the fighting in the Philippines was the bloodiest event ever on US soil. It’s estimated that 1.6 million died in the war, mostly Filipinos. The Philippines became independent in 1946—as the US began to distance itself from colonialism after the war.

 

Now, residents of territories have only limited rights, and their status as US citizens has been misunderstood by many mainlanders, including the current president. And, as Professor Immerwahr stresses, the cruel history of racism and white supremacy looms large in this story of empire, as the inhabitants of our possessions have been treated as inferior and unworthy of full protection as citizens based on race and ethnicity.

 

Professor Immerwahr teaches at Northwestern University, specializing in twentieth-century U.S. history within a global context. His first book, Thinking Small: The United States and the Lure of Community Development, won the Organization of American Historians' Merle Curti Award. His articles have appeared in Slate, Modern Intellectual History, Dissent, and Jacobin, among others. 

 

In an email exchange, Professor Immerwahr generously responded to questions on his new book and his work as a historian.

 

Robin Lindley: Thank you Professor Immerwahr for agreeing to discuss your groundbreaking new book, How to Hide an Empire. Before going to your book, could you describe how you decided to study history?

 

Professor Daniel Immerwahr: I went to Columbia University thinking I would major in music. Taking classes with Anders Stephanson, Betsy Blackmar, and Eric Foner showed me that there was a far better way to not make very much money. 

 

Robin Lindley: What inspired your new book on the history of “the Greater United States,” the US and its possessions? Did it grow out of your previous research?

 

Professor Daniel Immerwahr: For my first book, Thinking Small, I visited archives in Manila. I’d known, of course, that the Philippines had been a U.S. colony. But somehow being there made it click for me. It was like the difference between reading the lyrics and hearing the music. The colonial imprint is hard to miss in Manila, and I came back to California eager to read up on U.S. colonial history.

 

Robin Lindley: How would you briefly describe the historical problem you tackle in your new book?

 

Professor Daniel Immerwahr: When most people think of the United States, the country as they mentally map it has a familiar shape: the contiguous blob, with oceans on either side, Canada above, and Mexico below. But that shape only accurately captures the borders for three years of U.S. history (1854–57, since you asked). That’s partly because the United States started off smaller—we talk about that a lot. But it’s also because in 1857 it started expanding overseas. So, the challenge is to write U.S. history with the understanding that the contiguous blob is only part of the country. My book tries to offer a history of all the land under U.S. jurisdiction, of what some around 1900 called “the Greater United States.”

 

Robin Lindley: Your book offers an original perspective on a history that is generally ignored about empire and America’s “possessions.” You cover this wide history from the inception of our nation to the present and often in far-flung places from the point of view of inhabitants who are brought under US control. What was the research process for your ambitious book?

 

Professor Daniel Immerwahr: I had help. There are some parts of the book that come from my own archival research. But in a lot of parts, I leaned heavily on the work of my colleagues. As I say in the book, my main contribution isn’t archival. It’s perspectival.

 

Robin Lindley: Why do you think Americans generally know so little about the history of US territories and other possessions?

 

Professor Daniel Immerwahr: First, many people understand the United States, born of an anti-imperial revolt, as having an allergy to colonial empire. Second, I think the experience of settler colonialism left a mark on the country. Expansion, indigenous dispossession, and frontier wars fit easily into the national mythology. But the conquest of populous overseas territories? Not so much. That seemed to violate people’s sense of what the United States is. And so, many have dealt with that cognitive dissonance by simply not thinking very much about the overseas parts of the country. 

 

Robin Lindley: Your book offers a fresh perspective for most readers. You state that the history of American empire has been “persistently ignored.” Some historians have bristled at this contention, noting that historians have been studying the history of US empire for years. How do you respond to this reaction to your work?

 

Professor Daniel Immerwahr: I would agree. When I write that the overseas parts of the country have been “persistently ignored,” I mean by mainlanders, and I document that fact. The good news is that scholars—many writing from the sites of empire—have been telling the story of U.S. Empire for decades. A book like mine would have been impossible without their work, as my notes show. 

 

What’s exasperating is that, despite all of this high-quality research, it’s still too easy for teachers, students, and even U.S. historians on the mainland to talk as if the United States were merely a collection of states.

 

Robin Lindley: The history you share also provides yet another aspect of American race relations and racism. Citizens of territories were often seen as inferior and less worthy of justice under law than mainlanders. What should readers know about how race shaped the Greater US history?

 

Professor Daniel Immerwahr: They should know that racism didn’t only shape people’s lives within the country. It also shaped the country itself, determining the placement of the borders and, within those borders, which places would count as “American” and which as “foreign.” There’s a long history of U.S. leaders seeking to control which people are “in” and which are “out” of the country. Unfortunately, they’ve largely succeeded in writing Puerto Ricans, Filipinos, and Hawaiians out of U.S. history.

 

Robin Lindley: Your history begins with the ambivalence of some of the founders about “empire” and then the relentless westward expansion on the continent. This story is probably well known in its outlines as the US acquired regions through purchase and violence. What’s your take on this history of continental expansion as you set forth your larger story?

 

Professor Daniel Immerwahr: Westward expansion is a well-known story, as you say. But looking at it from the perspective of territorial empire, you see new facets. 

 

I got really interested in “Indian Country,” the federally delineated all-Indian zone established in the 1830s to contain removed Native Americans and Western groups who still held title over their land. At its start, it comprised 46 percent of the country’s area! 

 

I also found myself being much more attentive to the state/territory division, and to the place of territories within the federal system. We don’t always think about them in this way, but by many measures the Western territories looked more like colonies than embryonic states.

 

Robin Lindley: Your book is wide-ranging and illuminating, I wanted to ask about a few specific episodes, some of which surprised me. It’s fascinating that the off-mainland expansion began with a need for guano, or bird dung, in the 19th century. How did the US address this problem and why did the efforts eventually lead to an insurrection by workers?

 

Professor Daniel Immerwahr: It sounds bizarre, but seabird droppings were one of the most effective natural fertilizers around. And that mattered in the nineteenth century, when the transition to industrial agriculture left many eastern farms parched of nutrients. 

 

It was in search of guano that the United States started annexing islands overseas—ultimately nearly a hundred of them in the Pacific and Caribbean. These were uninhabited, but someone needed to be there to mine the guano. Guano companies came to rely on non-white laborers, essentially marooning them on these rainless, godforsaken islands with instructions to pick, shovel, and blast loose as much guano as possible. Unsurprisingly, guano workers mutinied. One such uprising, on Navassa Island in 1889, led to the killing of five white overseers and, ultimately, a Supreme Court case. It was where the Court first considered whether overseas expansion was consistent with the Constitution. It ruled that it was, thus laying the legal foundation for empire.

 

Robin Lindley: Many know the story of Seward’s Folly, the US purchase of Alaska from the Russians after the Civil War. However, the story of Hawai‘i is less well known. How did the US come to “possess” Hawai‘i?

 

Professor Daniel Immerwahr: For years, racists in Congress had resisted major overseas expansion, on the grounds that it would incorporate too many nonwhite people into the country. The war with Spain in 1898 broke that logjam, as imperialists—also racist, but happy to see Washington rule over distant subjects—ginned up enthusiasm for overseas empire. That war netted the United States the Philippines, Puerto Rico, and Guam. On something of an imperial splurge, Congress decided to annex Hawai‘i and American Samoa, too. It’s important to recognize that this was over the protests of Native Hawaiians. The historian Noenoe Silva has established that more than 38,000 of them signed anti-annexation petitions.

 

Robin Lindley: When I was a student, years ago, we learned little of the Spanish-American War except for the infamous sinking of the battleship USS Maine and the triumphalist stories of Teddy Roosevelt’s San Juan Hill charge in Cuba and Admiral Dewey’s victory at Manila Bay. It seems that the brutal Philippine-American War that followed was ignored—including the atrocities and violence by American troops that killed thousands of Filipinos and displaced many more. The US acquired the Philippines as a colony. What would you like readers to remember about our “winning” of the Philippines and the bloody aftermath?

 

Professor Daniel Immerwahr: I’d like them to grasp its magnitude. We think it claimed 775,000 lives, mostly from the diseases that it set loose. That makes it bloodier than the Civil War! And it lasted a long time. By the time the war in the north, which is what most people talk about, was fizzling out, the war in the south opened, and gave rise to some of the largest massacres in U.S. history. The south wasn’t put under civilian rule until 1913, the fourteenth year of the war overall. Only the Afghanistan War has lasted longer.

 

Robin Lindley: You also note that, in his December 8, 1941 address in response to the attack of the Japanese Empire on Pearl Harbor, President Franklin Roosevelt purposely omitted mention of the attack on American bases in the Philippines that occurred a few hours after the Hawai‘i raid. What’s your sense of FDR’s omission in his historic address and the US view of the Philippines in 1941? 

 

Professor Daniel Immerwahr: Richard Nixon called Pearl Harbor the “only piece of American territory that suffered directly from enemy attack in World War II,” and my guess is that most people would agree with that judgment. But it’s wrong. At the same time as Japan was attacking Pearl Harbor, it was moving on the Philippines, Guam, and Wake Island, all of which it soon conquered. 

 

The first draft of FDR’s speech made it abundantly clear that the Japanese had attacked Hawai‘i and the Philippines. But then he edited it, cutting prominent references to the Philippines. We don’t know why, but my strong suspicion is that he worried that the Philippines, which had a smaller white population and was farther away, wouldn’t readily serve as a casus belli. Certainly, opinion polls at the time showed that mainlanders cared less about defending the Philippines. 

 

Robin Lindley: And you mention that the US interned Japanese and Japanese-American nationals in the Philippines before the internment of Japanese Americans in the US. That ended with the fall of the Philippines to the Japanese, but what happened with this early internment effort?

 

Professor Daniel Immerwahr: It kills me that we never talk about this when telling the story of Japanese internment. Immediately after Japan attacked, Douglas MacArthur ordered police to round up the 30,000 people of Japanese ancestry living in the Philippines. It was brutal. I found accounts of rapes of Japanese women by civilians and soldiers. Accounts of Filipinos who hid Japanese friends in their homes, and who got punished for it. In Davao, guards in the internment camps repeatedly shot random prisoners. It ended not with a law but with Japan’s ground invasion. When Japanese forces took the camps, they freed the prisoners, who then in some instances locked up their former guards. 

 

Robin Lindley: I was surprised by the use of Puerto Rico as a medical laboratory where the citizens were seen as guinea pigs. What happened there? Were there other territorial sites for medical and scientific experiments?

 

Professor Daniel Immerwahr: There’s a long and painful history of the territories serving as laboratories. In my book, I tell how doctors, lawyers, and architects found that, in the territories, they could try out new ideas with little effective resistance and with a great deal of impunity. 

 

Although you can find this sort of thing happening in all the territories, doctors have time and again experimented on Puerto Rico. The most notorious is the story of Cornelius Rhoads, a Harvard-trained doctor sent to treat hookworm. He intentionally withheld treatment from some patients, he tried to induce disease in others, and he wrote a letter to a Boston colleague saying that he’d murdered eight of his patients outright. You’d think that this would get him fired and imprisoned. But even though the story came out, he was barely dinged—what happens in San Juan stays in San Juan. He went on to make the cover of Time magazine as a medical hero, one of the inventors of chemotherapy.  

 

Robin Lindley: How do you think the negligent US response to Puerto Rico after Hurricane Maria in 2017 figures in this history? It seems the current US president did not then understand that he was also the president of Puerto Rico.

 

Professor Daniel Immerwahr: Trump has a discomfiting way of addressing Puerto Rico in the second person. “I hate to tell you, Puerto Rico, but you’ve thrown our budget out of whack,” he said after the hurricane—note the pronoun use. He’s operating with an unconcealed sense of a here and a there, an us and a them. There have been other examples of members of his administration referring to the overseas parts of the country as if they were foreign territory. I wish I could say this was particular to Trump, but I don’t think it is. Even Woodrow Wilson, a far more thoughtful president, described the territories as lying “outside the charmed circle of our own national life.” 

 

Robin Lindley: How has the status of territories and other US possessions evolved since the Second World War? 

 

Professor Daniel Immerwahr: The United States sought to distance itself from colonial empire after the Second World War, in part due to international pressure. It granted the Philippines independence in 1946 and made states of Hawai‘i and Alaska in 1959. Puerto Rico, the remaining member of the big four, became a “commonwealth” in 1952. That didn’t change Congress’s ultimate power over the island, but it was nevertheless enough for the United Nations to strike Puerto Rico from its list of “non-self-governing territories.” All of that was a concession to decolonization, but it’s important to note that the United States never fully shed its empire. It still has five inhabited territories—Puerto Rico, Guam, American Samoa, the U.S. Virgin Islands, and the Northern Marianas—and millions of people live in them. 

 

Robin Lindley: With the change in status of territories in recent decades, the US is described as a “pointillist empire” of military bases and US facilities around the world. What is the US empire today?

 

Professor Daniel Immerwahr: The United States might not go in for colonial empire as it once did, but it has not given up claiming foreign land. It’s just that now the lands Washington cares most about are not populated colonies but small enclaves. The United States has hundreds of foreign bases—David Vine has estimated 800. It’s not a lot of area in all. If you took all the land that the U.S. controls outside of the states and DC, you’d get an area smaller than Connecticut. But those hundreds of points, strewn around the globe, matter a lot. They matter to Washington and they matter to the many countries that host or are threatened by the U.S. basing structure. 

 

Robin Lindley: What do you see in terms of US empire in the age of Trump? How will politics now affect the future of the territories you studied?

 

Professor Daniel Immerwahr: Trump, as usual, says the quiet part out loud. In so doing, he’s shone a spotlight on the subordinated position of the territories and on the racism that subordinates them. The issue is far more visible today than it was a decade ago. Will that lead to status changes or reforms in rights and representation? I won’t hazard a guess. 

 

Robin Lindley: What’s your next project? Will you continue research on the Greater United States?

 

Professor Daniel Immerwahr: Once you see the underlying geography of the United States as the Greater United States rather than the mainland, it’s hard to unsee it. I’m sure I’ll do more research. But right now I’m working on a book about nineteenth-century environmental catastrophes and a series of studies about the pop culture of U.S. post-1945 global hegemony.  

 

Robin Lindley: Would you like to add anything about your book or your work for readers?

 

Professor Daniel Immerwahr: Don’t underestimate the historical importance of bird poop.

 

Robin Lindley: That’s an excellent point. Congratulations on your original new history, Professor Immerwahr, and thank you for sharing your thoughtful remarks. 

 

 

 

The Declassified History of Hitler's British Traitors

 

For almost 80 years, Britain has told itself – and the world – a powerful story about the country’s heroism during the dark days of World War Two. Newspapers, television and the cinema have portrayed the years between 1939 and 1945 as the country’s finest hours: the spirits of Dunkirk and the Blitz, and a nation bonded by the rubrics of ‘Keep Calm and Carry On’ and ‘Make Do and Mend’, are repeatedly invoked to create an enduring narrative of brave stoicism.  

 

That narrative is not false.  It is simply not the whole story. Hundreds of once-secret Security Service (MI5) and British government files have now revealed an uncomfortable truth:  that for all the genuine unity and determination of the vast majority of the population to defeat Hitler, there was also a small – but dangerous – sub-stratum which yearned for the day when his troops could goose-step down Whitehall amid an orgy of swastika flags.

 

The files, quietly de-classified and released to the UK National Archives in the years between 2000 and 2018, show that more than seventy British men and women were convicted – mostly in secret trials – of working to help Nazi Germany win the war.  Additionally, hundreds of other British Fascists were interned without trial on detailed evidence that they were spying for, or working on behalf of, Germany.  Collectively, these men and women were part of a little-known Fifth Column. Two members were executed, whilst most of the others received lengthy prison sentences or were detained throughout the war. 

 

If these men and women were, for the most part, lone wolves or members of small, localized networks, others were far more dangerous. The de-classified Security Service files document three plots led by well-connected British Nazi leaders to launch a violent “fascist revolution”.  

 

Undercover MI5 agents penetrated each of these conspiracies.  The reports they filed show that all three of the organisations were involved in espionage, spreading Nazi propaganda, illicit contact with Third Reich officials and, ultimately, plans for armed coups d’état, which aimed to replace the elected British government with a pro-Nazi puppet regime, just as soon as German troops landed in England.

 

Each of the schemes reached its zenith during the nation’s ‘darkest hour’ – late spring and early summer of 1940, when Britain was bracing itself for invasion.  Their leaders – and many of their followers – belonged to the country’s traditional ruling classes: Parliament, the aristocracy and the military.  And yet, in marked contrast to the fate of the more lowly foot soldiers of British fascism, these well-connected traitors were never brought to trial for their crimes.  Instead they were either quietly interned for the duration of the war or – in the case of many of the most senior plotters – left entirely free.

 

The first conspiracy was led by Archibald Maule Ramsay, a sitting Conservative MP, who wanted “a civil war with shots fired in the streets”, and anticipated being made “Protector” of Scotland by a victorious Hitler.  

 

With the invasion of Britain imminent, Ramsay’s Chief of Staff, Anna Wolkoff and a contact at the American Embassy, Tyler Kent, were caught sending British military and political secrets to Berlin.  The evidence showed that Ramsay and his aristocratic wife were knowingly involved in this conspiracy. MI5 wanted all of them charged, but only Wolkoff and Kent were ultimately tried and convicted; Wolkoff got 10 years in prison, Kent seven. The Director of Public Prosecutions decided that the Ramsays should not face court; instead Ramsay was interned for four years, but he retained his seat in the House of Commons and his MP’s salary – and he was allowed to carry on taking part in parliamentary business throughout. His wife was left entirely free, and continued to run their secret pro-Nazi organisation, The Right Club.

 

The names of its members were kept in a large red leather-bound ledger. Throughout the war MPs made formal requests to the Home Secretary for it to be published.  He refused on the grounds that publication was “not in the public interest [and] I do not propose to give any indication of what names there are, or are not, on this list.” That ledger remained under official wraps for almost 60 years. Only in 2000 was it released to the Wiener Library in London.  Analysis of its contents shows that of the 242 Right Club members listed, 13 were titled aristocrats and 12 were sitting MPs; there were also three members of European Royal Families and at least five senior Army officers. There is no evidence that any were ever arrested.

 

Nor was Ramsay the only British fascist planning to bring about a pro-Nazi revolt. Dr Leigh Francis Howell Wynne Sackville de Montmorency Vaughan-Henry was a celebrated composer, music critic and author; he appeared regularly on the BBC, had been director of music at the theatre institute in Florence and had conducted orchestral performances for the Royal Family. He barely gets a footnote in official histories, but Henry had a police and MI5 record as a pro-Nazi Fascist and violent anti-Semite.  He was in regular contact with Nazi officials in Germany, had been entertained by Party leaders in Berlin and had made at least one radio broadcast for propaganda chief Josef Goebbels.

 

His coup plot, penetrated in spring 1940, seems to have been the most militarily advanced. Undercover MI5 agents reported that his organisation was divided into 18 cells of 25 members each.  As German troops swept across Europe that May, the cell leaders were told that “Revolution is to take place after the total loss of the Channel ports and defeat on the Western Front, and an effort is to be made to link up with the enemy in Holland.”  Other plans involved “intimidation of certain people by threat, and possible action against their wives and children; bumping off certain people.”  He was also found to be in the process of obtaining thousands of rifles and ammunition.

 

Yet Henry was never charged with any crime. Instead, like Ramsay, he was interned throughout the war and then quietly released.  The same preferential treatment was given to the leader of the third coup plot.  

 

John Beckett, a former Independent Labour MP turned fascist, drew up plans to replace the British Government with a Quisling-style “Coalition Government of National Security”.  The names of the ministers in this proposed puppet regime amounted to a roll-call of pro-Nazi British aristocrats and military leaders.  Other than Beckett and his closest aides, none were ever arrested, much less brought to trial.

 

What makes all of these conspiracies even more remarkable is their complete absence – along with the seventy trials and convictions of fascist traitors – from the official accounts of Britain during World War Two. They form part of a secret history, one which was suppressed in the seven decades afterwards. The delay in de-classifying the official files that document these plots and crimes has never been explained, and the noticeably low-key nature of their release to the National Archives has ensured they largely escaped scrutiny.   

 

I spent more than a year examining and analysing thousands of individual documents in these files.  Together, they plainly show that throughout World War Two senior and influential figures in the British establishment not only supported Hitler but took active – and illegal – steps to hasten a German victory; and, further, that there is compelling evidence that they were protected from the consequences of their actions by reason of their privileged status in society.

How the Specter of Andrew Johnson Haunts Donald Trump

 

A drama of monumental proportions and historic significance is unfolding every day in Washington, DC. There are endless predictions and speculation by politicians, pundits, and the people about whether Donald Trump will join the ranks of Andrew Johnson and Bill Clinton as the only presidents impeached. As a presidential historian, I naturally began to think about the past and how it relates to our present moment. In thinking about Johnson, I was surprised to find that he and Trump are cut from the same cloth. While Trump may revel in comparing himself to Andrew Jackson, it is actually Andrew Johnson – certainly not the most admirable or respected of our presidents – with whom he shares six eerie similarities.

 

1. Neither Andrew Johnson nor Donald Trump ever expected to be president. 

Johnson was a politically convenient vice-presidential candidate for Abraham Lincoln as Lincoln sought a second term in 1864. Johnson was a staunch Union loyalist, a Democrat from Tennessee, which had left the Union in 1861. The presence of a Democrat on the ticket bolstered Lincoln’s campaign message of national unity. It was a politically strategic move for Lincoln to help him win another four years as president. But Johnson was just the vice-president – “the most insignificant office that ever the invention of man contrived or his imagination conceived,” to use the words of a frustrated John Adams.

 

Certainly, vice-presidents John Tyler and Millard Fillmore had ascended to the presidency due to the death in office of William Henry Harrison and Zachary Taylor, respectively, but at 56, Lincoln was relatively healthy, despite having significantly aged because of the stress of the Civil War. Surely, he would complete a second term if elected. No president had ever been assassinated. So, Andrew Johnson, with elective and appointive government positions dating back decades, settled in to warm the vice-presidential chair, never expecting that he would be suddenly thrust into the presidency.

 

Like Johnson, Donald Trump never expected to be president. He ran in 2016 as a publicity and profit-making stunt to promote the Trump brand of businesses, and to gear up for launching a Trump media organization. He campaigned against 16 formidable Republicans also vying for the Republican nomination, all of whom had more experience governing than the real estate mogul and reality TV show host. In the general election, Trump was pitted against Hillary Clinton, one of the most experienced candidates ever to seek the presidency. In the end, on the night of November 8, 2016, none of that mattered as a shocked Trump – despite devastating disclosures of mistreatment of women and breaking all precedents of normal behavior – was elected president. 

 

2. Both Johnson and Trump ascended to the presidency in times of great national unrest. 

Johnson presided over the aftermath of the Civil War that had militarily, economically, and racially pitted the north against the south, while Trump strategically re-ignited the bitter flames of the culture wars to win the presidency. His fear-based campaign pitted blue states against red states – Americans against Americans (and the rest of the world). It’s striking that the current blue/red state geography isn’t all that different from the north/south map of the 1860s. And the nation is more divided and polarized today than perhaps at any time since the Civil War.

 

3. Both Johnson and Trump followed presidents from Illinois whose eloquence they failed to match.

Abraham Lincoln and Barack Obama were two of the most articulate, measured, and thoughtful presidents the nation has produced, and both hailed from Illinois. Neither Johnson nor Trump was similarly blessed with such eloquence; instead, both exhibited all the tendencies of a demagogue. 

 

In 1866, Harper’s Weekly described President Johnson: “His exhibition of temper, his intemperate, and often indecent, denunciation of his political opponents remind us rather of the demagogue than of the unimpassioned and well-balanced statesman.” When speaking before the “masses,” as Johnson liked to call the people, “his speeches are disconnected, full of repetitions, and not even his official position as Chief Magistrate of the United States is sufficient to keep him within the limits of good sense and decorum.”

 

If we didn’t know any better, we could easily imagine that Harper’s Weekly was describing Trump’s fiery rhetoric that easily fits the definition of a demagogue as “a political leader who seeks support by appealing to the desires and prejudices of ordinary people rather than by using rational argument.” Trump’s language is not reasoned, measured, or eloquent. He repeats simple words – seemingly at a loss for anything more articulate – that are designed only to spark raw and angry emotions and not engage minds.

 

4. Temperamentally, both Johnson and Trump could be described as belligerent.

As the sun set on the first day of 1849, President James K. Polk confided to his diary his feelings about then congressman Andrew Johnson. Earlier that day, he had seen the Tennessee legislator in the crowd at the Executive Mansion reception but hadn’t spoken to him. Nevertheless, after less than five years in Washington, Johnson’s reputation was well-known, and Polk wrote that the 41-year-old congressman was “very vindictive and perverse in his temper and conduct.”

 

Similarly, Trump has upended all common decency by viciously attacking anyone who takes a position that threatens his world view or who makes any negative comments about him. He is, in his own words, a “counterpuncher,” who reacts rather than reasons. He has ridiculed foreign allies and leaders, dubbed the media as “the enemy of the people,” stirred up the passions of his base against those who look, speak, and worship differently than the white America of the 1950s, berated his own attorney general as “disgraceful” and “mentally retarded,” claimed his former secretary of state was “dumb as a rock,” and brought the world to the brink of nuclear war by blasting North Korea’s leader as “Little Rocket Man.”

 

5. Johnson and Trump share a common aversion to facts.

 

Both men kept their distance from truth, lest somehow reality get in the way of their desired political narrative. In May 1868, General Ulysses S. Grant sat down to chat with a Republican senator. The conversation turned to the esteemed general’s opinion about the political question that was consuming Washington: should the president be convicted in his impeachment trial? Grant bluntly lobbied the senator, John B. Henderson of Missouri: “I would impeach him if for nothing else than because he is such an infernal liar.” Henderson ultimately disagreed and was one of 19 votes to acquit the president and keep him in office. 

 

If Grant considered that Johnson was an “infernal liar,” what would he think about Trump? It is probably no exaggeration to say if the ghost of the Civil War general found himself suddenly transported to Washington today, he would quickly and vocally assert that Trump is also an “infernal liar.” Trump makes liberal and shameless use of hyperbole, grounded only in the narrative he wants to believe in the moment, regardless of any external realities. According to the Washington Post, Trump has uttered or tweeted more than 10,000 false and misleading claims (lies) since his fact-free presidency began, undoubtedly dwarfing Andrew Johnson’s record.

 

6. The specter of Johnson’s impeachment threatens to envelop the 45th president as well.

On the 1,046th day of Andrew Johnson’s hapless presidency, the House of Representatives voted to impeach him – forever dubiously enshrining his legacy. In less than six months (on December 1, 2019), the similarly bombastic Donald Trump will have been in office for 1,046 days. Does the president get cold sweats as he hears Johnson’s footsteps following him in the White House – fearful that what most defined Johnson’s presidency will also define his?

 

Johnson, of course, was the first president to be impeached and yet managed to survive being evicted from the presidency by one vote in the Senate. At this point, Trump’s margin of senatorial support of loyal Republicans is probably much larger. But if “smoking gun” facts emerge, Trump could, in fact, find himself the first president removed from office by Congress. Only time will tell. 

 

The actions that led to Johnson’s impeachment are eerily similar to Trump’s behavior. Johnson incurred the wrath of Congress for defying the institution, specifically for violating the Tenure of Office Act that required congressional approval before dismissing a cabinet official. Trump has also defied Congress by refusing to allow officials or former officials to respond to congressional subpoenas and requests for documents. In both instances, there have been titanic battles between the two branches of government.

 

Both Andrew Johnson and Donald Trump can be characterized as belligerent demagogues lacking eloquence or a commitment to truth. They both unexpectedly became president in times of great national unrest. Finally, serious discussion of impeachment is haunting Trump. When the history books are written, impeachment may define Trump’s presidency in a similar way that “impeachment” and “Andrew Johnson” are solidly and forever linked together.

 

Hawaii’s Cowboy Heroes

 

At the turn of the 20th century, three Hawaiian cowboys arrived in Wyoming to compete in a rodeo extravaganza. What happened next overturned much of what people thought they knew, and still think they know, about Hawaii, the American West, and the relationship between the two.

 

August 20, 1908: The crowd at Frontier Days in Cheyenne, Wyoming, waited excitedly for a cowboy to step into the arena. It was the first of three days of competition at the biggest rodeo in the world. Winning in Cheyenne was the rodeo equivalent of Olympic gold. For weeks the small city at the foothills of the Rockies had been abuzz with talk of three men who traveled from thousands of miles away to vie for the title of champion roper. The mysterious riders from afar had dark skin, long rawhide lariats, and flowers around the brim of their hats. Ikua Purdy and his cousins, Archie Kaʻauʻa and Jack Low, were paniolo, Hawaiian cowboys. Their forebears had been roping cattle in the islands decades before there was such a thing as an “American” cowboy.

 

The early years of the new century marked a tense time in Hawaii. American business interests had orchestrated an overthrow of Hawaii’s monarchy in the late 1800s, followed soon after by forced annexation. For many people throughout the archipelago, these events and this new political reality were not only traumatic, but they also created uncertainty and anxiety. What would incorporation into the United States mean for Hawaiian culture and Hawaii’s future? Within this context, the story of Ikua and his cousins is much more than a rodeo matchup. These Hawaiians were accidental ambassadors of their people.

 

Meanwhile, frontier towns like Cheyenne were undergoing their own dramatic changes. The boom years of the great cattle drives that had made Cheyenne—for a brief window of time—one of the richest towns in the world had ended with the arrival of homesteaders and barbed-wire fences that partitioned the vast rangelands. Civilization was surging into the future, with the advent of technologies like cars, airplanes, and moving pictures. Yet life in Cheyenne remained hard, and even harder for the Native Americans who had been forced from their ancestral lands.

 

At the same time, residents of Wyoming watched as “the frontier” was being mythologized and turned into entertainment before their eyes. Dime novels, the blockbuster popularity of Buffalo Bill Cody’s Wild West Shows, and rodeo competitions like Frontier Days fueled a fascination with cowboy culture that was sweeping the nation. Aware of this change, people in places like Cheyenne took steps to preserve their heritage and profit from outsiders’ curiosity about it. Frontier Days was billed as the “Grand Pageant of the West That Was,” and it served as a stage to revel in this history—at least a white man’s version of it. Locals wanted Frontier Days to push back against the narrative of a tamed, vanished West. Audiences could rest assured that this show, at least, offered “no cheap imitations or sugar coating to sweeten the tongues of the tenderfoot.”

 

When it came to sporting competition, every cattle roping title since Frontier Days started in 1897 had been won by a Wyoming cowboy. Newspaper coverage of the Hawaiians was a mix of curiosity and thinly veiled condescension. Many people at the time didn’t even know that the Hawaiian Islands had been absorbed into the United States. Whoever these interlopers were, no one really expected them to be actual contenders.

 

The stands and bleachers of Frontier Park were packed, and cars, carriages, and riders on horseback crowded against the fence. Nobody wanted to miss seeing the “dusky wizards of the rawhide,” as the papers called them. Without giving away the ending, suffice it to say that in 1908, the steer-roping championship left Wyoming.

 

Upon their return to Hawaii, the paniolo were greeted as heroes. Everywhere they went they were met with cheering crowds, parades, feasts, speeches, even poetry and hula composed in their honor. Their expert performance in Cheyenne came at an important moment in the history of a nation that was still mourning its stolen sovereignty. As the Hawaiian Star newspaper put it: “[N]ow they have seen a man from the Parker Ranch beat all their champions, they will realize that the Hawaiian Islands are something more than a hula platform in the middle of the Pacific.” It was as if all of Hawaii had just said to Washington D.C. and the white men who run it: You may think you own us, but you don’t really own us.

The Case for Religious Progressivism

 

A recent article in The Washington Post dealing with Democratic candidate Pete Buttigieg’s religious beliefs argues that “American progressivism, for all that is good about it, is no more Christian than political conservatism.” Disagreeing with that perspective, the following essay asserts the value of religious progressivism—not as the only valid type of progressivism, but rather as one vital contributor to it. Progressivism should be “a big tent,” welcoming people of all races, spiritual beliefs (including atheistic humanism), classes, and sexual preferences, united by the conviction that free-market capitalism is not a sufficient public philosophy. A higher goal, seeking the common good, must take precedence and oversee our economic system. 

 

“Religious progressivism” here means a progressivism motivated by values that have long been championed by the world’s major religions—love, wisdom, compassion, empathy, humility, patience, prudence, and self-discipline. Other important values (or virtues), such as tolerance, have sometimes been undervalued by those considering themselves religious, but one can argue (as I do here) that tolerance should flow naturally from love and humility. The same can be said for another value, a sense of humor. The leading American religious thinker of the twentieth century, the Protestant Reinhold Niebuhr, believed that the ability to laugh at oneself reflects humility and is “a prelude to faith; and laughter is the beginning of prayer.” 

 

Although progressivism as a political movement did not manifest itself in the USA until the final decade of the nineteenth century, progressive thinking appeared much earlier and was sometimes linked with religious beliefs. One outstanding example was that of Frederick Douglass.

 

Today many who call themselves religious are critical of progressive positions. After all, 81 percent of white evangelicals supported Trump in the 2016 presidential election. But their Christianity differs profoundly from that of someone like the abolitionist Douglass, whom Barack Obama identified as one of America’s five great reformers who “not only were motivated by faith but repeatedly used religious language to argue their causes.” 

 

In one of his great speeches, given in 1852, Douglass blasted most U.S. Christian churches for being “not only indifferent to the wrongs of the slave,” but actually taking “sides with the oppressors.” Many of Christianity’s “most eloquent” leaders, he charged, “have shamelessly given the sanction of religion and the Bible to the whole slave system.” Their “horrible blasphemy is palmed off upon the world for Christianity.” For his part, however, he “would say, welcome infidelity! welcome atheism! welcome anything! in preference to the gospel, as preached” by those leaders.

 

Yet, as D. H. Dilbeck has recently demonstrated, this man who said “welcome atheism!” adhered to a Christianity that “shaped his public career,” and was “vital to understanding who he was, how he thought, and what he did.” 

 

By the time of Douglass’s death in 1895, the Progressive Era (1890-1914) had begun. One reliable historical work describes Progressivism as a diverse movement “to limit the socially destructive effects of morally unhindered capitalism, to extract from those [capitalist] markets the tasks they had demonstrably bungled, to counterbalance the markets’ atomizing social effects with a countercalculus of the public weal [well-being].” It did not attempt to overthrow or replace capitalism, but to have government bodies and laws constrain and supplement it in order to insure that it served the public good. 

 

Progressives worked to produce a graduated federal income tax, reduce corruption in city governments, limit trusts and monopolies, expand public services, and pass laws improving sanitation, education, housing, and workers’ rights and conditions, especially for women and children. Progressive efforts also helped pass pure food and drug laws and create the National Park Service. Some Progressives like Jane Addams, who in 1889 established Chicago’s Hull House to aid the poor, also worked hard to secure the vote for women, which was not achieved in presidential elections until 1920. After three Republican presidents held office over the following dozen years, Franklin Roosevelt renewed the Progressive tradition.

 

Historian Jill Lepore writes, “Much that was vital in Progressivism grew out of Protestantism, and especially out of a movement known as the Social Gospel, adopted by almost all theological liberals and by a large number of theological conservatives, too.” They argued that “fighting inequality produced by industrialism was an obligation of Christians. . . . Social Gospelers brought the zeal of abolitionism to the problem of industrialism.” 

 

Outside of the United States, Christians also occasionally interpreted the message of Jesus in a liberal or socialist manner. Two examples were the French Catholic priest Félicité Lamennais (1782-1854), who broke with his church, and the Russian ecumenical thinker Vladimir Soloviev (1853-1900), a prominent lay theologian and philosopher who allied with Russian liberals against Russian Orthodox conservatives.

 

Back in the United States, the tradition of a radical leftist Christianity continued during the FDR years and beyond with the work of Dorothy Day (1897-1980). A convert to Catholicism, she was praised by Pope Francis in his 2015 Address to the U.S. Congress: “In these times when social concerns are so important, I cannot fail to mention the Servant of God Dorothy Day, who founded the Catholic Worker Movement. Her social activism, her passion for justice and for the cause of the oppressed, were inspired by the Gospel, her faith, and the example of the saints.” Earlier, in his The Audacity of Hope (2006), future President Obama had identified her—along with Douglass, Abraham Lincoln, William Jennings Bryan, and Martin Luther King, Jr.—as one of America’s “great reformers” who were motivated by religious faith. A pacifist who also identified with non-violent anarchism, Day was profoundly influenced by her Catholic beliefs.

 

But her Catholicism was not of the narrow, dogmatic sort but one appreciative of ecumenical outreach. She once wrote that “there is no public figure who has more conformed his life to the life of Jesus Christ than Gandhi”—another twentieth-century figure whose political philosophy was strongly indebted to spiritual beliefs. Day greatly admired him for advocating spiritual means of resisting violence and injustice. She also wrote of seeing Christ in some anarchists “because they are giving themselves to working for a better social order for the wretched of the earth.”

 

Day was often arrested for protesting. In 1974, she mentioned that she had “been behind bars in police stations, houses of detention, jails and prison farms, whatsoever they are called, eleven times.” Her last arrest, in 1973, was for “unlawful assembly,” in the midst of picketing in behalf of the itinerant Mexican workers of the United Farm Workers led by her friend Cesar Chavez, another Catholic whose protests were fueled by his religious beliefs. During the 1960s Day sometimes picketed and spoke in behalf of civil rights, and she greatly admired Martin Luther King, Jr. (MLK), whom she referred to as “a man of the deepest and most profound spiritual insights.” 

 

MLK, of course, was a Baptist minister, who had received his Ph.D. from Boston University’s School of Theology. As a student he had been strongly influenced by the early twentieth-century Social Gospel movement and throughout his life by Gandhi, whom King considered “probably the first person in history to lift the love ethic of Jesus above mere interaction between individuals to a powerful effective social force on a large scale.” (Christian liberation theology, developed in Latin America in the mid-20th century, was another influence on MLK.) The Gandhian mass non-violent resistance tactics that he developed were to be “based on the principle of love.” 

 

The whole civil rights movement of the late 1950s and 1960s was spurred on by religious principles. To help lead it, King and others (many black ministers) formed the Southern Christian Leadership Conference (SCLC), and he was its first president. In addition, King’s demonstration marches “brought together people . . . from widely differing church traditions, not only Christians but also Jews and humanists.”

 

In our own day, Barack Obama has been one of the strongest proponents of a religious progressivism. His The Audacity of Hope contains a long chapter on “Faith.” In it the future president wrote, “Surely, secularists are wrong when they ask believers to leave their religion at the door before entering the public square.” He discussed his own path to religious faith, culminating in his baptism at Trinity United Church of Christ in Chicago. And he stated that “if we progressives shed some of our own biases, we might recognize the values that both religious and secular people share when it comes to the moral and material direction of our country,” and that such values could help lead to “the larger project of American renewal.” After becoming president, Obama continued to make the case for Christians pursuing a progressive approach.  

 

A leader whose tenure began about the same time as Obama’s second term, Pope Francis, has also often advocated a religious progressivism on such issues as the flaws of capitalism and environmental responsibility. And like Obama, he has warned against ideological rigidity. In a 2013 sermon he cautioned Christians against making their religion into an ideology: “When a Christian becomes a disciple of the ideology, he has lost the faith. . . . But it is a serious illness, this of ideological Christians. . . . [It is too] rigid, moralistic, ethical, but without kindness.”

 

Thus, those who argue, “We should keep religion out of politics,” are being simplistic. One can more properly cite individuals like Frederick Douglass, Dorothy Day, Martin Luther King Jr., Barack Obama, and Pope Francis who counter that the spiritual and moral values commonly advocated by major religions should inspire our politics.

Presidential War Powers and Bill Clinton's Battles

 

As I write this, I hold my breath. Will President Trump plunge the nation into an undeclared war with Iran? Will it set off a global cataclysm, as Russia sides with its ally? 

 

Meanwhile, Congress tries to end the President’s support for the Saudi-led slaughter in Yemen. In Syria too, U.S. forces engage in undeclared bloodshed. And, though Congress never declared war on Afghanistan, America’s struggle there nears eighteen years. Many young Americans have never known peace.

 

Early in U.S. history, it was firmly established that Congress made the decision to fight a war. The Constitution assigned that grave decision to the national legislative body so it would not be made often or frivolously, in the manner of Old World kings. Nowadays, the United States wages wars constantly, on the whim of a single person. 

 

Why do presidents now have so much power to go to war when that power is unconstitutional? In June 1950, President Harry S. Truman sent armed forces to fight in Korea without obtaining the permission of Congress. Few protested. His impulsive decision initiated a bloody, three-year war with the North Koreans and Chinese.

 

In imitation of Truman’s action, all subsequent presidents have engaged in unauthorized military activity. As a result of presidential wars, millions have died, including some 120,000 Americans (my calculation). U.S. forces fight in so many lands today that the tragedy of war has become commonplace and few of us seem concerned. 

 

Why do presidents commit those lawless acts? There are the official reasons, which make the headlines and claim free time on TV. Then there is the truth. 

 

In an insightful letter (to Thomas Jefferson), James Madison wrote in 1798: “The constitution supposes, what the History of all Governments demonstrates, that the Executive is the branch of power most interested in war, and most prone to it. It has accordingly with studied care vested the question of war in the Legislature.”

 

The executive decision may hinge on personal or other irrelevant motives. Bill Clinton’s belligerent acts appeared to illustrate that. They also showed pure presidential war-making, in which Congress and law played no direct role.

 

Clinton’s deeds offered lessons for Americans, which should have been learned from the Korean experience. Had Congress promptly punished the high crime of presidential war instead of accepting it docilely, we might have avoided the debacles in Vietnam, Iraq, Afghanistan, Libya, Syria, Yemen, and elsewhere.

 

Below I review chronologically seven of Clinton’s acts of war, ending with his NATO-aided bombing campaign against Yugoslavia, which concluded twenty years ago in June.

 

Iraq

 

Clinton’s first bombing of Baghdad, on June 26, 1993—killing eight civilians—was supposedly punishment for an attempt by Saddam Hussein to kill George H. W. Bush. Kuwaiti police had arrested seventeen men, claimed to find a bomb in a car from Iraq, and said an Iraqi “confessed” to an assassination plot. On the witness stand, he declared he was innocent and signed a paper because police beat him.

 

Seymour Hersh wrote in The New Yorker (Nov. 1, 1993) that Clinton had been mired in controversy over his cautious Bosnia policy, and White House staffers advised that “bombing Baghdad would improve Clinton’s political standing at home and his diplomatic standing in the Middle East.” Past and present intelligence officials told Hersh the acceptance of the Kuwaiti allegation was based on “conflicting and dubious evidence.”

 

Bosnia

 

Amid a civil war among Bosnian Serbs, Croats, and Muslims came two gory explosions in Sarajevo’s main market, in 1994 and 1995. Supposedly in response to the latter blast, Clinton and NATO promptly launched a heavy bombing campaign against Serbs—without considering the evidence. (It was ambiguous and did not point to any party as culpable, Professors Steven Burg and Paul Shoup wrote in The War in Bosnia-Herzegovina, 1999.) Clinton later sent 20,000 U.S. troops to Bosnia to join NATO “peacekeepers.”

 

By showing toughness, he could further his re-election prospects after being called wishy-washy and anti-military. One writer believed that Clinton, in expectation of cheap oil and huge aircraft sales, intentionally advanced the Saudis’ desire for an Islamic country in Europe.

 

Iraq again 

 

Clinton bombed Iraqi air defenses—and some civilians—on September 3 and 4, 1996, to make Saddam Hussein “pay a price” for sending troops to Kurdish Iraq. (Hussein said he was quelling strife between factions.) U.S. presidential voting was two months off.

 

Afghanistan & Sudan

 

The media covered Clinton’s sex scandal heavily. Widely suspected of lying by denying his association with the intern Monica Lewinsky, he was advised to come clean to get the public on his side. On August 17, 1998, in grand jury testimony and a television address, he abandoned months of denial and admitted “inappropriate” contact with her and having misled the public and his own wife. A poll taken immediately after the speech showed that a favorable rating of 60 percent five days earlier had dropped to 40 percent.

 

On August 20 Clinton bombed Afghanistan and the Sudan. The news upstaged the Lewinsky scandal. Clinton claimed he was fighting “terrorists.” But it soon transpired that one of his supposed terrorist targets was the Sudan’s only pharmaceutical factory, indicating haste in planning the raids.

 

Two senators and two representatives questioned Clinton’s timing and credibility, and the Los Angeles Times asked whether the movie Wag the Dog had come to life. In the movie, a Hollywood producer was hired to fabricate a war to distract the public from a presidential sex scandal. But Clinton’s acts of war were real.

 

Iraq once more

 

In early December 1998, the biggest news concerned impending congressional proceedings on the impeachment of Clinton. The question was scheduled for House floor debate on Thursday, the 17th. Voting appeared likely the next day.

 

On Wednesday, the 16th, Clinton again bombed Iraq, falsely claiming it was not cooperating with UN inspectors. Consequently the House postponed the impeachment matter for a day and Iraq took over the headlines. The bombings, which killed a couple of hundred Iraqis, continued until impeachment was voted on December 19.

 

Yugoslavia

 

For three months, peace talks went on in Rambouillet, France, over strife between Yugoslavia and ethnic Albanians in the Serbian province of Kosovo. Other nations, including the U.S., participated.

 

What brought matters to a head, in March 1999, perhaps had less to do with European troubles than with two news stories troubling Bill Clinton. One dealt with an Arkansas woman’s allegation that he raped her twenty-one years earlier when he was attorney general of Arkansas. Another concerned allegations in the Republican Congress of Chinese theft of U.S. nuclear weapons secrets and inaction by Clinton, alleged recipient of campaign donations from China. A House committee had prepared a classified report on the matter and a Senate panel planned an investigation.

 

In Belgrade, Yugoslavia, Clinton’s envoy, Richard Holbrooke, delivered an ultimatum to the president, Slobodan Milosevic. To avoid war, the latter had to sign an agreement letting NATO troops occupy all Yugoslavia, then comprising Serbia and Montenegro. A day or two later, on March 23, Holbrooke forwarded the go-ahead for war to NATO’s secretary general in Brussels.

 

The attack came March 24, wiping the allegations about Clinton off the TV news and newspaper front pages. U.S. and other NATO forces spent the next eleven weeks hitting Yugoslavs with air-launched missiles, bombs, and bullets. Hillary Clinton may have influenced Bill’s decision. On March 21, when he was undecided about attacking Yugoslavia, she phoned and “urged him to bomb” (as quoted by biographer Gail Sheehy in Hillary’s Choice, p. 345).

 

Yugoslav officials placed civilian bombing casualties at 2,000 killed, 10,000 wounded. Official estimates of civilian war deaths from all causes went as high as 18,000. Eighteen U.S. deaths were reported (Wikipedia). Why the mass killing by the U.S.-NATO forces? Bill Clinton said the attacks were meant to stop mass killing in Kosovo, which had been going on for a long time. But if that killing was such an old story, why did he choose the time he did to start a war? Could this attack and the previous three attacks all have served as distractions from scandal?

 

President Clinton not only usurped Congress’s authority under the Constitution to decide whether to go to war (Article I, Section 8) but continued bombing even after two rebuffs by the House of Representatives on April 28, 1999: a 427–2 vote against declaring war and a 213–213 tie vote rejecting the bombing.

 

Abundant writings of the nation’s founders confirm the congressional war power. Take Alexander Hamilton: “The president is to be commander-in-chief of the army and navy.… It would amount to nothing more than … first General and Admiral …” (The Federalist, 69, 1788). “… It is the peculiar and exclusive province of Congress, when the nation is at peace, to change that state into a state of war …” (“Lucius Crassus” 1, 1801). And Thomas Jefferson: “Congress alone is constitutionally invested with the power of changing our condition from peace to war …” (message to Congress, 1805).

 

But when the chief executive ignores Congress, what then? That official is entrusted with armed force all the way up to hydrogen bombs. The lawmakers’ strongest weapon is impeachment by the House of Representatives (the bringing of charges) and trial by the Senate (to decide if removal from office is warranted). 

 

Only two presidents have been impeached, neither removed: Andrew Johnson (1868) and Clinton (1998). The charges against the latter were not illegal war-making and numerous counts of homicide, but perjury and obstruction of justice. The Senate acquitted him.

 

See www.warandlaw.org/files/ClintonsWar.htm for a more detailed account of the Yugoslav war, including the massacre allegations, the media’s reporting, and the role of international law.

What Historians Are Saying: 2020 Election Democratic Primary Debates

Stonewall's Legacy and Kwame Anthony Appiah's Misuse of History

 

In a recent op-ed in the New York Times, “Stonewall and the Myth of Self-Deliverance” (June 22, 2019), Kwame Anthony Appiah, a distinguished philosopher at New York University, tries to debunk what he considers a myth: that the 1969 Stonewall Rebellion led to the takeoff of the Gay Rights movement in the United States. Instead, he credits “black-robed heterosexuals” (judges) who made important legal decisions and “mainstream politicians” who jumped on the gay rights bandwagon decades after Stonewall. In Appiah’s philosophy, long-term abusers who finally recant and mainstream political figures who drew them into coalitions, not the people whose actions created crises and precipitated change, deserve credit for marginalized groups gaining basic human rights.

 

Curiously, much of Appiah’s argument rests on events in Great Britain rather than the United States, events that he does not completely or accurately report. Appiah’s hero is Leo Abse, a backbench Labour Member of Parliament who pushed a private member’s bill that, Appiah claims, “basically decriminalized homosexuality.” The Sexual Offences Act of 1967 revised the 1533 Buggery Act, which made male same-sex sexual activity punishable by death, and an 1885 law that eliminated the death penalty but maintained criminality. 

 

What Appiah ignores in his claims is the lead-up to passage of the 1967 act and the limits of the act itself. In the Cold War climate of the early 1950s, British police were actively enforcing laws prohibiting male homosexuality, partly out of concern with national security and partly as part of right-wing anti-communist crusades. A series of high-profile arrests and show trials, including of prominent members of the British elite, led to imprisonments and, in the case of Alan Turing, a scientist and World War II hero, to enforced chemical castration and suicide. Reaction to the anti-homosexual campaign led to the 1957 Wolfenden Report, which recommended the decriminalization of homosexuality. However, it was not until 1967, when the Labour Party held power, that the recommendations were implemented. Abse was credited with the bill so Labour Party leadership could distance themselves from accusations that they were pro-gay.

 

Appiah also missed that the Sexual Offences Act maintained general prohibitions against “buggery” and indecency between men, and only provided for a limited decriminalization when sexual relations took place in the privacy of someone’s home. It was still criminal if men had sexual relations in a hotel or if one of the parties was under age 21. 

 

Appiah is correct that individual acts of defiance, by themselves, do not generate major institutional change. Rosa Parks was not a little old lady who refused to give up her seat on a bus because she was tired. Parks was part of organized resistance to bus segregation by the local NAACP and a coalition of Montgomery’s Black churches.

 

What Appiah misses in his dismissal of the Stonewall Rebellion’s historical importance is that symbols like Rosa Parks sitting down and Stonewall are crucial to social movements as they mobilize and move from the political margins to the center of civic discourse. In the 1850s, the Underground Railroad was an important symbol both for Northern abolitionists demanding an end to slavery, because it demonstrated the enslaved Africans’ desire for freedom whatever the risk, and for Southern slaveholders demanding constitutional protection for their “property” rights. John Brown was executed for his role in the 1859 raid on the federal armory in Harpers Ferry. Two years later, United States troops marched into battle against the Confederacy singing that while Brown’s body is “moldering in the grave, his soul’s marching on.”

 

Appiah’s argument is indicative of his larger philosophical outlook. As I read his philosophical work, Appiah tends toward Hegelian idealism, the belief that abstract ideas somehow play an independent and dominant role in shaping history. In New York Times Magazine advice columns his tendency is to recommend a hands-off or non-interventionist approach to personal moral dilemmas. Politically, Appiah places his hopes for change on persuasion and rejects intervention.

 

In his book The Honor Code: How Moral Revolutions Happen (Norton, 2010), Appiah credited British commitment to the idea of “honor” for the end of the trans-Atlantic slave trade and slavery in the British colonial empire. Missing from the book are any references to Toussaint Louverture and Sam Sharpe, leaders of slave rebellions in Haiti and Jamaica that shook the colonial world and were instrumental in ending slavery. Paradoxically, Appiah managed to attribute honor and idealism to 19th-century British leaders, the same people who were busy colonizing India and Africa and marketing opium in China, while ignoring famine in Ireland and exploiting and impoverishing their own working class.

 

But Appiah’s argument doesn’t end with how history is written: it’s also about how history is taught. As a teacher and teacher educator, I was disturbed by Appiah’s dismissal of the way the “great moral crusade of the 19th century” is now taught in schools. He cites the “New York State Regents curriculum guide, which shapes public high school education in the state,” especially its reference to “people who took action to abolish slavery” that “names four individuals, all but one of them people of color.”

 

While Appiah is worried about what he considers a misleading high school curriculum, he is actually quoting from the 4th grade New York State Regents standard (4.5) for teaching about slavery. It focuses on the biographies of individuals with connections to New York and includes Samuel Cornish (New York City), Frederick Douglass (Rochester), and Harriet Tubman (Auburn). It also introduces students to William Lloyd Garrison, a white, non-New York abolitionist. The focus on biography may be simplistic, but it is fourth grade, after all, and the children are ten years old.

 

The high school standards, which like the 4th grade standards are advisory, not mandatory, are very different. They recommend that students “analyze slavery as a deeply established component of the colonial economic system and social structure, indentured servitude vs. slavery, the increased concentration of slaves in the South, and the development of slavery as a racial institution” (11.1); “explore the development of the Constitution, including the major debates and their resolutions, which included compromises over representation, taxation, and slavery” (11.2c); “investigate the development of the abolitionist movement, focusing on Nat Turner’s Rebellion, Sojourner Truth, William Lloyd Garrison (The Liberator), Frederick Douglass (The Autobiography of Frederick Douglass and The North Star), and Harriet Beecher Stowe (Uncle Tom’s Cabin)” (11.3b); and recognize that “Long-standing disputes over States rights and slavery and the secession of Southern states from the Union, sparked by the election of Abraham Lincoln, led to the Civil War. After the issuance of the Emancipation Proclamation, freeing the slaves became a major Union goal” (11.3c). 

 

Standard 11.10b focuses on how “Individuals, diverse groups, and organizations have sought to bring about change in American society through a variety of methods.” It includes “Gay Rights and the LGBT movement (e.g., Stonewall Inn riots [1969])” among efforts to achieve equal legal rights. In addition, in the 12th grade civics curriculum (12.G2d), students learn that “the definition of civil rights has broadened over the course of United States history, and the number of people and groups legally ensured of these rights has also expanded. However, the degree to which rights extend equally and fairly to all (e.g., race, class, gender, sexual orientation) is a continued source of civic contention.” 

 

The New York Times should have done a better job fact-checking Appiah’s essay. Philosophy may be allegorical. History definitely isn’t.

Roundup Top 10!  

Which of the F.D.R. Wannabes Actually Understands New Deal Liberalism?

by Jonathan Alter

Suddenly, Franklin D. Roosevelt is all the rage. But many Democrats don’t understand what his legacy means.

 

‘Never again’ means nothing if Holocaust analogies are always off limits

by Danya Ruttenberg

Yes, every situation is different. That doesn’t mean we can’t compare them.

 

 

Whatever Happened to Moral Capitalism?

by Michael Kazin

Let one loyal, if anxious, Democrat offer a solution: “moral capitalism,” a system that, in the words of Congressman Joe Kennedy III of Massachusetts, would be “judged not by how much it produces, but how broadly it empowers, backed by a government unafraid to set the conditions for fair and just markets.”

 

 

The Closure

by Brendan O'Malley

What happens when the college you work for closes? A first-hand account of the state of higher education.

 

 

Black People’s Land Was Stolen

by Andrew W. Kahrl

In addition to invoking the 40 acres black people never got, the reparations movement today should be talking about the approximately 11 million acres black people had but lost, in many cases through fraud, deception and outright theft, much of it taken in the past 50 years.

 

 

Boston’s Black History is American History

by Kevin Levin

We still have a ways to go in reaching beyond the traditional narrative of history in Boston and beyond.

 

 

What South Africa Can Teach Us About Reparation

by Ereshnee Naidu-Silverman

Reparations can work, but only if we start telling the truth about racism and slavery.

 

 

The war on journalism is a war on the humanities

by Monika Eisenhauer

"The intention of historical science is to write the history of people and societies as objectively as possible. For this purpose it needs free and unrestricted access to the sources."

 

 

The Anti-Abortion Politics of White Women

by Jacqueline Mercier Allain

The reality is that women—white women in particular—are among today’s anti-choice movers and shakers. 

 

Racism, Reparations, and the Growing Political Divide

Steve Hochstadt is a professor of history emeritus at Illinois College, who blogs for HNN and LAProgressive and writes about Jewish refugees in Shanghai.

 

 

When I was growing up, white racism was a powerful and ubiquitous force in American life. It was impossible to ignore and nearly impossible to remain untouched, even if one consciously believed that skin color had nothing to do with human worth.

 

The outburst of civil protest about racism in the 1960s was a sign of American optimism: our democracy had severe flaws with deep historical roots, but they could be overcome through peaceful political action. The intransigence of openly racist politicians from the South and covertly racist politicians from everywhere else would yield to massive popular dissent. The Civil Rights Act of 1964, the Voting Rights Act of 1965, and the Fair Housing Act of 1968, among many other legislative victories for racial equality, introduced a new era in American history.

 

It was comforting to believe that over time America would no longer be divided unequally into black and white, that the effects of racism would gradually disappear as legal racism itself became a thing of the past. Viewed from today, that idea appears hopelessly optimistic, the dream of political Pollyannas, who ignored the long history and crude reality of American racism. Every survey and social scientific study demonstrates the continuing power of racism to distort and impoverish the lives of black Americans. What is relatively new is the congruence of the partisan and racial splits.

 

Much has changed for the better, as evidenced, for example, by the ability of black politicians to win races in every state. The ceiling on black political success has been lifted a bit, but not broken. Until 2013, there was never more than one black Senator in office at a time. Although about half of US Senators were first elected to the House of Representatives, that path has been available almost exclusively to white politicians: only one black member, Republican Tim Scott, has moved from the House to the Senate. There have been only two black governors in our history.

 

Donald Trump has certainly exacerbated racism in America, but the racist attitudes that he plays on never disappeared. While the openly racist public displays of ideological white supremacists have become more common, the much larger undercurrent of racist beliefs has finally found a comfortable home in the Republican Party base.

 

Only 15% of Republicans say that our country has not gone far enough in giving blacks equal rights. The number of Republicans who say that American politics has already gone too far in giving blacks equal rights is twice as large. Three-quarters of Republicans say that a major racial problem is that people see discrimination where it doesn’t exist and that too much attention is paid to racial issues. One out of five Republicans say being white hurts people’s ability to get ahead, and one out of three say that being black helps. Nearly half of Republicans say that “lack of motivation to work hard” is a major reason why blacks have a hard time getting ahead, and more than half blame “family instability” and “lack of good role models.” One-third of Republicans say that racial and ethnic diversity is not good for our country. Among white Republicans who live in the least diverse American communities, 80% wish for their communities to stay the same and 6% want even less diversity. Half of Republicans say that it would bother them to hear a language other than English in a public place.

 

While racist Americans have congregated in the Republican Party, white Democrats appear to be moving away from racial resentment. Between 2014 and 2017, the proportion of white Democrats who said that the country “needs to continue making changes to give blacks equal rights” grew from 57% to 80%. A different study offers even stronger numbers. In both 2012 and 2016, about half of Republicans displayed “racially resentful” attitudes toward blacks, and only 3% expressed a positive view. Among Democrats, the proportions shifted: the proportion who were positive about blacks doubled, and the proportion who felt most resentful fell by nearly half.

 

Apparently, college-educated whites had long known that the Democratic Party was more likely to be sympathetic to blacks on racial issues, and thus sorted themselves politically according to their own racial attitudes. But less educated whites came to recognize this partisan difference more recently, especially during the Obama presidency, and those with racial resentments who had been Democrats moved to the Republican Party.

 

While white Americans who feel negatively about blacks and believe that too much has been done to redress centuries of discrimination are collecting in the Republican Party, Democrats, both politicians and voters, are openly discussing reparations. Eighty percent of Democrats believe that the legacy of slavery still affects the position of black Americans. The House Judiciary Subcommittee on the Constitution, Civil Rights and Civil Liberties held an unprecedented hearing on slavery reparations last week. Rep. Sheila Jackson Lee of Texas has proposed a bill “To address the fundamental injustice, cruelty, brutality, and inhumanity of slavery ... between 1619 and 1865 and to establish a commission to study and consider a national apology and proposal for reparations”. Such a bill had been stalled in the House for 30 years. Now Speaker Nancy Pelosi says she supports it. The four Democratic Senators who are presidential candidates, Sanders, Booker, Warren, and Harris, all co-signed a Senate bill to study reparations. Julian Castro, Kirsten Gillibrand, and Beto O’Rourke also support such a study. Amy Klobuchar and Joe Biden have been circumspect, but not dismissive. I couldn’t find any Republican political figure who supports even a study of the issue.

 

Reparations are already being considered, and even enacted: Georgetown University students voted 2 to 1 to impose a $27.20 fee on themselves to compensate the descendants of the 272 slaves sold by Georgetown in the 1830s.

 

One of the big arguments that Republicans, like Mitch McConnell, have used against even thinking about reparations is that slavery is 150 years in the past. But government discrimination against African Americans deliberately deprived them of financial resources in my lifetime.

 

Returning black veterans could not take advantage of the GI Bill because of racist tactics in both North and South. Blacks were excluded from getting home loans in the newly expanding suburbs.

 

Reparations would certainly be difficult to decide upon and to administer. The clean partisan split over whether to consider the issue demonstrates how Republicans and Democrats are moving away from each other. We’ll see whether that strengthens or weakens continuing American racism.

How the Debate Over the Use of the Term ‘Concentration Camp’ was Amicably Resolved in 1998

When, on June 18th, the Jewish Community Relations Council of New York (known locally as the JCRC) addressed an open letter of complaint to Rep. Alexandria Ocasio-Cortez for calling migrant detention centers “concentration camps,” the JCRC was reflecting how emotionally charged this term is for Jews. In subsequent statements, Ocasio-Cortez made it clear that she was not drawing an analogy to Nazi-era death camps. The JCRC’s letter compounded what might be considered community-relations malpractice by patronizingly offering “to arrange a visit to a concentration camp, a local Holocaust museum, hear the stories of local survivors, or participate in other educational opportunities in the hopes of better understanding the horrors of the Holocaust.”

 

But lost in the controversy was a resolution of a parallel dispute in 1998 that redounded to the credit of all concerned.  At that time, Japanese-American organizers were preparing a museum exhibit at Ellis Island entitled “America’s Concentration Camps: Remembering the Japanese American Experience,” on the forced relocation and imprisonment of Japanese Americans by the United States government during World War II.  Instead of criticizing the exhibit’s curators, the American Jewish Committee (AJC) conferred with them and amicably arrived at an arrangement that satisfied both understandable Jewish sensibilities regarding the memory of the Holocaust and the right of other Americans to commemorate the injustice they endured during those very same years. This was explained in their joint press release:

 

An exhibit—entitled America’s Concentration Camps: Remembering the Japanese American Experience—chronicling the shameful treatment of Japanese Americans during World War II, will soon open at the Ellis Island Immigration Museum. Thousands have already seen the exhibit, which was created by and, in 1994, shown at the Japanese American National Museum in Los Angeles. Today, our sights are trained on the importance of such an exhibit in teaching about episodes of intolerance. We strongly urge all who have the opportunity to see the exhibit to do so and to learn its critical lessons.

 

A recent meeting between Japanese American and American Jewish leaders in the American Jewish Committee’s New York City offices led to an agreement that the exhibit’s written materials and publicity include the following explanatory text:

 

“A concentration camp is a place where people are imprisoned not because of any crimes they have committed, but simply because of who they are. Although many groups have been singled out for such persecution throughout history, the term ‘concentration camp’ was first used at the turn of the century in the Spanish-American and Boer Wars.

 

“During World War II, America’s concentration camps were clearly distinguishable from Nazi Germany’s. Nazi camps were places of torture, barbarous medical experiments, and summary executions; some were extermination centers with gas chambers. Six million Jews were slaughtered in the Holocaust. Many others, including Gypsies, Poles, homosexuals, and political dissidents were also victims of the Nazi concentration camps.

 

“In recent years, concentration camps have existed in the former Soviet Union, Cambodia, and Bosnia.

 

“Despite differences, all had one thing in common: the people in power removed a minority group from the general population and the rest of society let it happen.”

 

The meeting and the agreement about the text also reinforced the close and constructive relationship that has long existed between the Japanese American and American Jewish communities. Jewish community groups, especially the American Jewish Committee, were among the first and most vocal outside the Japanese American community calling for the U.S. government to offer an apology and monetary redress for its treatment of Japanese Americans during World War II.

 

In 1988, Congress and President Reagan passed legislation that formally granted the redress and apology to Japanese Americans who were incarcerated. Both communities have been among America’s leading voices advocating for strong civil rights, anti-discrimination and hate crimes laws. The meeting’s participants were encouraged to continue the work of preserving the memories of our communities’ experiences and helping others learn from them.

 

The exhibit represents a precious opportunity for those who must tell its story—Japanese Americans and other victims of tragic intolerance—and for those who must hear it. The story is one of betrayal; betrayal of Japanese Americans, who were deprived of protections that all Americans deserve; and betrayal of the American soul, which is defined by its unique commitment to human rights. The best insurance that we will never again commit such acts of betrayal is to use history of this sort as an object lesson for Americans today and in the future.

 

We know that today’s iteration of this dispute over terminology and history is political in ways that the 1998 episode was not, as exemplified by partisan brawling over the meaning and motives behind Ocasio-Cortez’s words. Still, it’s good to know that communities and individuals can come to an accord over such a sensitive matter when they exercise prudent judgment.

10 Things To Check Out At the Library of Congress’s New Exhibit on Women’s Suffrage

1. “Declaration of Sentiments” print

Although the original version is lost, this printed version of Elizabeth Cady Stanton’s “Declaration of Sentiments” has somehow survived 150+ years and now sits in the first display of the exhibit to show where the suffrage movement began to gain steam in the US. In it, Stanton demands moral, political, and economic equality.

 

2. "More to the Movement” placards

These placards, interspersed throughout the exhibit, cast light on minority women suffragists – figures who have usually gone unacknowledged in accounts of US and women’s history. While the Seneca Falls Convention is conventionally seen as the beginning of the suffrage movement, the “More to the Movement” placards note that women’s rights were first considered as an issue in 1837, at the Anti-Slavery Convention of American Women in New York City. 

 

3. Women in Politics video

The end of the exhibit features a video compilation of famous speeches made by women politicians and figures, including Shirley Chisholm, Sandra Day O’Connor, Ileana Ros-Lehtinen, and Hillary Clinton when she became the first woman presidential nominee from a major party. 

 

4. "Music of the Suffrage Movement” display

Tucked into the corner of one of the displays is this screen, which allows visitors to see and listen to the music that inspired suffragists across generations to resist. Many of the songs were anthems composed and written to be sung in large public forums.

 

5. Kentucky House of Representatives Roll Call

Although Kentucky was not the 36th and final state required to ratify the 19th Amendment, this sheet of paper has been preserved as a reminder of the rapid progress of the movement after World War I. The 19th Amendment was officially ratified just months after this roll call was taken.

 

6. Suffragist cap and cape

Displayed in a case along with other 20th-century artifacts, this white cap and purple cape were worn by members of the National Woman’s Party from 1913 to 1917.

 

7. Abigail Adams letter

Remarkably, this letter from Abigail Adams to her sister Elizabeth Shaw Peabody has survived 220 years. Part of it reads, “I will never consent to have our sex considered in an inferiour point of light. Let each planet shine in their own orbit.”

 

8. “Surviving Prison and Protecting Civil Liberties” display

This display powerfully documents the commitment many women showed to the movement’s ideals during World War I. Suffragists protested the war as hypocritical, resulting in prison sentences and, in some cases, torture. Posters and newspaper headlines depicting women with feeding tubes shoved down their throats convey the strength and resolve of these women in the face of horrendous treatment.

 

9. Story of Ida B. Wells-Barnett

Ida B. Wells-Barnett’s story reveals how many women, particularly women of color, faced challenges and obstacles beyond the fight for women’s suffrage. African American suffragists were segregated during the March 1913 national suffrage parade, but Wells-Barnett refused to comply. She marched with her state group from Illinois, despite some parade organizers’ endorsement of the segregation policy.

 

10. World War I protest photos and artifacts

An entire section of the exhibit is dedicated to women’s protest of World War I as hypocritical, noting that while the US fought for “democracy,” it denied over half of its own citizens the right to vote. It features original images and artifacts from the resistance movement, including a fragment of a protest sign.

The Stonewall Exhibit in New York Needs Sturdier Walls

A paper fan, one of the objects at the exhibit

 

On the morning I went to see the museum exhibit Stonewall 50 at the New-York Historical Society, about discrimination against gays in America, Newark Liberty Airport held its first-ever drag queen show in Terminal C, with a huge crowd of cheering onlookers. New Yorkers were celebrating World Pride month. Mayor Pete Buttigieg of South Bend, Indiana, an openly gay man, was not only running for President of the United States but sitting comfortably fourth in the polls. The LGBTQ organization at my university is growing and thriving. Could any of these things have happened in the summer of 1969, the summer of the fabled Stonewall riots that gave birth to the gay movement in America?

 

Of course not.

 

Stonewall was a momentous event, and its 50th anniversary is being celebrated in many ways across the country, including this exhibit at the New-York Historical Society, at W. 77th Street and Central Park West in New York. 

 

The exhibit on Stonewall, though, needs more walls of its own. It also needs a sharper focus and a richer historical backdrop. As it stands, it is a pretty weak Stonewall itself.

 

You read a book on the history of the New York Yankees and it is about …the Yankees. Here, there is an exhibit about Stonewall but there is little Stonewall. There is a small mention of the club and riots, sort of an afterthought, and that’s it.

 

The Stonewall riots of 1969 not only shook the city to its core, but all of the country, too. It was not just an event in gay history, but American history. How can there be nothing about Stonewall in an exhibit about Stonewall? Am I missing something?

 

The infamous Stonewall riots began early in the morning of June 28, 1969, when a team of New York City police raided the Stonewall Inn, a small gay bar on Christopher Street in New York’s Greenwich Village. The police burst into the bar and arrested numerous customers. The patrons, gay men and women, did not go quietly into the night. They protested, loudly. That brought in more police. That brought in more protestors, more than 600, who shut down numerous streets, and Stonewall was quickly engulfed in a riot. Fires were set, police cars nearly overturned, buildings damaged and the neighborhood trashed. The riots continued for six days. Police arrested 21 people. Much of this was captured by TV news crews, and the riots, seen nationally, became famous. They gave the gay community new strength and changed gay history in this country.

 

All of this drama is generally overlooked in the exhibit.

 

The exhibit has numerous problems. First, it is really small, almost as small as the Stonewall Inn itself. The whole exhibit takes up just two walls and a tiny, tiny room that is really, really badly lit. That’s it. You could walk down the hall and, if you don’t look carefully, you will miss the entire exhibit and wind up amid the Birds of America.

 

This is not really an exhibit about Stonewall, or even the volcanic effects of the Stonewall riots; it’s an exhibit about gay life in New York in the 1950s and 1960s. That’s fine, but the museum should not connect it to Stonewall.

 

There are some very witty posters in the exhibit, such as “No Lips Below the Hips” and very sad ones, such as “Lesbians Don’t Get AIDs, They Just Die from It.”

 

There are all sorts of protest march posters, gay magazines and even 1950s gay paperback novels, about both men and women. There is a huge wall of items from the Lesbian Herstory Archives in New York. There are some lesbian “Lavender Menace” T-shirts sprinkled throughout the exhibit. 

 

“The Stonewall uprising…was a watershed moment in the gay rights movement and we’re proud to honor its legacy…during this 50th anniversary year,” said Dr. Louise Mirrer, President and CEO of New York Historical. “The history of New York’s LGBTQ community is integral to a more general understanding of the long struggle for civil rights on the part of LGBTQ Americans. We hope that with our Stonewall exhibition and displays, our visitors will come to appreciate the critical role played by Stonewall in helping our nation towards a more perfect union.”

 

The exhibit is in three sections. Letting Loose and Fighting Back: LGBTQ Nightlife Before and After Stonewall is about entertainment, particularly in clubs. It is the best of the three, by far. In it, visitors get a really good understanding of how gay clubs had to operate in the 1950s and 1960s as secret entertainment centers, and meeting places. There were dozens of them, usually with non-descript fronts. Inside there was really wild, colorful entertainment by singers and dancers, splashy floor shows and lengthy and loud drag queen extravaganzas. The exhibit highlights these in a splendid display that includes color videos of shows in clubs and marvelous miked conversations by apparently unknowing customers at the club. 

 

The exhibit includes special guidebooks that helped newcomers to the city find gay entertainment, as well as club posters, programs, ticket stubs and even the complete architectural plans for The Saint club. Just outside the club room is an exhibit on the gay queen Rollerena, a 1970s Wall Street worker who roller-skated to work each day and into the gay clubs each night. There is a tribute to the cross-dressing Flawless, an entertainer and club hostess (cross-dressing was illegal in the United States for years).  

 

The gay club The Blue Parrot was the centerpiece of the “bird clubs” for gays throughout town.  The very mainstream Hotel Astor had a gay club, the Matchbox.  The 82 Club was where celebrities like Judy Garland and later Liz Taylor hung out.

 

The exhibit explains, too, how city leaders neatly got around small items like the First Amendment in policing the clubs. They invented a “disorderly persons” charge that covered all gay activity and permitted tough law enforcement at the venues. Hundreds of people were arrested for being “disorderly.”

 

The most interesting part of the exhibit, presented in this section, is the role the Mafia played in gay clubs. Homosexuals had nowhere else to go, so the Mob bought or controlled many of the gay clubs, including the Stonewall Inn. The Mob made a fortune and used some of the clubs as fronts for Mafia activities. The Mob paid off the police not to raid them. There was a slip-up over Stonewall in 1969, though, and that miscue, it was said, led to the raid and the subsequent riots.

 

The entire club exhibit has a very authentic you-are-there feel to it. 

 

By the Force of Our Presence offers highlights from the Lesbian Herstory Archives. It includes memorabilia from the life of a lesbian African American woman born in North Carolina who moved to Harlem, plus numerous posters.

 

Say It Loud, Out and Proud: Fifty Years of Pride is a wall that contains an enormous timeline of years of gay pride parades and protest and photos of men and women, in wild costumes, who marched in the parades.

 

The exhibit is curated by Rebecca Klassen, Jeanne Gardner Gutierrez and Rachel Corbman.

 

The exhibit offers a nice look at gay life in New York, and America but, still, it needs a little bit of Stonewall because those folks at Stonewall caused a whole lot of trouble, trouble that changed history.

Tangled Lives, Tangled Culture, Tangled History

 

It is 1979, and sodomy is illegal in some states and cross-dressing in others. It has been ten years since the Stonewall riots in New York. The gay community has emerged from the closets of the nation but is still trying to find itself. Marvin, a headstrong, good-looking young man, is married to his beloved Trina and is the father of an adorable ten-year-old boy, Jason. Life is good, life is strong, but Marvin has a problem. He is in love with another … a guy.

 

He leaves his wife and son and moves in with his boyfriend. His wife, stunned, falls for and lives with her psychiatrist, Mendel. The ten-year-old boy is thunderstruck by all this activity and concentrates on playing Little League baseball a bit better to keep calm.

 

Their story is told in a new revival of the 1980s musical Falsettos, which just opened at the Princeton Summer Theater in Princeton, N.J.

 

This version of Falsettos is rock solid, a super musical with fine acting, sharp direction, edgy songs and a solid, and quite emotional, story to tell.

 

I wanted to see this play about 1979 because all the problems the gay community faced forty years ago remain today, but the solutions are better, the public perception of those problems is better, and people who face those problems today do not suffer as their forebears did back in 1979. Life for gays in America is better, but not yet as good as it is for heterosexuals. That theme is underscored in Falsettos, which has music and lyrics by William Finn and a book by Finn and James Lapine. Today, though, Falsettos is more than a good play; it is a historic look backwards at the suddenly open lives of gay men and women and the troubles, legal, cultural and medical, that besieged them in the early 1980s.

Falsettos is a triumphant, but not an altogether happy, tale. Marvin does not find true love with his new boyfriend in 1979. They split up and he is left stranded because his wife has moved in with the psychiatrist. Marvin has lost her and has nearly lost his son. 

 

Trina, the wife, who faces more problems than the Biblical Job, wants the kind of straight, traditional loving relationship with the psychiatrist that she so badly desired with Marvin, but did not get. She is looking for a safe harbor in a mixed-up world and finds it – somewhat. 

 

The songs in the play are good and help to tell the story. They also add nuances to the play. As an example, the opening number is a very hip “Four Jews in a Room Bitching,” a very witty tune, and it sets up what you think is going to be a funny play, which it is, for a while. Later songs, such as “I’m Breaking Down” and “Something Bad Is Happening,” give the play its serious side.

 

Falsettos is a fine show, but it has its problems. It is a good twenty minutes too long. The performance I saw ran nearly an hour and forty-five minutes. Several songs in the first act are redundant and could be cut, along with a scene or two. In fact, there are 37 songs in the play, way too many. This is a play, not an opera. The first act also drags a bit here and there, and the storyline gets lost. The second act is much better. Not only is the story tighter, but it is far more dramatic and introduces two gay women to the storyline. The second act also expands the plot from marital woes to the social and cultural troubles of this torn family and brings in the medical woes that gays faced in the 1980s.

 

Director Daniel Krane has overcome these problems, though, by offering up a fast-moving story that is heavy on emotion. He has a fine cast that works well as an ensemble. Its members also shine in their work as individuals. They are Michael Rosas as Marvin, Justin Ramos as Mendel, Dylan Blau Edelstein as boyfriend Whizzer, Hannah Chomiczewsi as Jason, Chamari White-Mink as the doctor and Michell Navis as Cordelia. Their neat dance numbers are choreographed by Jhor Van Der Horst.

 

This thirty-year-old musical about gay life is decidedly not worn out or dated. It is as vibrant today as when it debuted in New York.

 

What’s interesting about it is that today the play offers a nice historical window for people to look back at the gay rights crusade, and all of its triumphs and tragedies, by viewing a play that debuted right in the middle of it.

 

Much is different about gay life today than in 1979. Marriages between gays are legal now, as an example. Yet many of the troubles homosexuals faced back in the ’70s still exist. Finn and Lapine could rewrite this play set in 2019 and the plot and characters would remain much the same. It would still be a powerful punch-in-the-stomach tale, though.

 

PRODUCTION: The play is produced by the Princeton Summer Theater. Sets: Jeffrey Van Velson. Costumes: Jules Peiperi. Lighting: Megan Berry. Sound: Tashi Quinones. The play is directed by Daniel Krane. It runs through June 30.
