The Future of History in the Pandemic Age

Reading Room of the Maritime Research Center, San Francisco

Attempting to predict the future is always perilous, and events frequently humble those who dare to try. Making predictions is especially hazardous for historians, who often struggle to explain the past. Peering into the future is not part of their professional training, and their efforts to do so are likely to fail.

But the need to prepare for an uncertain future forces us to think ahead. For historians, the most immediate question is how COVID-19 will affect the profession of history. Though the answer depends on several things—the timely advent of a safe, affordable, and effective vaccine being chief among them—a number of changes to the profession are almost certain to occur.

One big problem facing historians is that conducting research will become more time-consuming. Unlike scholars in some other fields, historians normally have to travel to conduct their research. Archives will likely institute hygiene-related protocols, which will increase the time historians must wait to gain access to documents. Moreover, reading rooms in archives may admit fewer researchers at a time, which would involve further delays. These and other COVID-related measures could significantly slow the pace of archival research.

The additional time needed to do research will make it more expensive as well. Archives are often located in major cities, where the cost of living can be quite high. Though the current lower cost of short-term lodging reflects a diminished demand, prices will probably rise once the pandemic is under control. This could result in scholars spreading out research trips over multiple years, thus lengthening the time needed to complete dissertations, books, and articles. The tenure clock might have to be extended to compensate for this longer research process. Graduate students, who generally have to make do with extremely tight budgets, will be especially hard hit.

To compensate for shorter research trips, historians will undoubtedly seek technological solutions. The digital camera has already revolutionized research, allowing a scholar to photograph thousands of documents in a single day. Though many historians continue to do research the old-fashioned way—taking notes by hand—more of them may now pick up a camera instead. This shift could change how the profession trains graduate students in the methodology of historical research.

Closely related to this issue is the possibility that the pandemic might also push history departments at research-intensive (R1) universities to move away from requiring faculty to publish books in order to receive promotion and tenure. Publishing articles, which is the standard in social sciences such as economics and sociology, may become the norm. Writing a history book often involves traveling to many archives, some of them distant from the scholar’s home base. The process can take years. Articles, however, can be written in much less time and without as much archival research. Diminished research budgets and the health risks associated with extended travel could persuade history departments to accept articles for promotion and tenure in place of the traditional monograph.

The pandemic will almost certainly accelerate some trends that are already underway. The number of students majoring in history has been declining for years. Majoring in history in a more uncertain economic environment might be seen as an unaffordable luxury for all but the wealthy. Rather than follow their academic passion, students may instead pursue majors that appear to provide better job prospects. The STEM fields (Science, Technology, Engineering and Mathematics) will probably be one of the main beneficiaries of this shift away from the humanities.

To survive this trend, history departments may redirect their scholarly emphasis. Rather than focusing on more traditional courses, departments could offer more courses on subjects such as the history of medicine and disease in order to remain “relevant.” They might also offer fewer courses in the realm of public history (oral history, archival studies, conservation, historic preservation, digital humanities), even though such work is how historical records are preserved. The institutions that hire public historians, such as archives and museums, are suffering financially in the current upheaval, so they may do less hiring in the future.

History departments might also rethink their opposition to military and diplomatic history. Both fields are seen as bastions of “great man” history, which has long been reviled in academic circles. Despite these criticisms, they remain a big draw for students. Another reason history departments might welcome more emphasis on military and diplomatic history is that the resurgence of nationalism makes expertise in these areas more relevant than it was at the peak of the trend toward globalization. History departments may lean in a more “practical” direction to satisfy the demand for relevancy and decide to offer more courses in these fields.

Other potential changes will affect all of higher education, but historians specifically will need to think about them. Teaching has already changed, at least in the short term. Professors are on campus less often than in pre-pandemic times. Historians may prefer to teach and meet students virtually rather than be on campus and risk becoming infected. To mitigate such concerns, campuses will undergo physical changes. Any new classrooms built will be larger in order to accommodate the same number of students, allowing students and professors to practice social distancing. 

Departments will also have to alter their standards for evaluating the effectiveness of teaching. The migration from traditional face-to-face teaching to online instruction due to the pandemic will likely push departments to make virtual courses part of their permanent offerings. Yet remote learning is significantly different from the face-to-face learning experience. Teaching evaluations will have to change to account for this difference. On a related note, departments would be wise to offer their faculty and graduate students formal and informal courses on the best online learning practices.

Many colleges and universities depend heavily on room and board and tuition to fund operations. But the pandemic is already causing fewer students to live on campus, and as a result they (and their parents) are demanding lower tuition rates. This will mean less money for many academic departments. History departments will have to figure out how to avoid draconian budget cuts. The main beneficiaries of the competition for funds will undoubtedly be the STEM-related fields, which have already gained in popularity and influence. Public and private sector entities have for years been redirecting funds from the humanities to STEM. Moreover, because graduates in these fields command much higher salaries than those in the humanities, parents prefer that their children major in STEM-related fields— no doubt as a hedge against an uncertain economic future.

Changes in higher education precipitated by the pandemic could have a significant impact on the number of people who are actually working as historians and doing historical research. Professors who fall into one (or more than one) risk group might take early retirement, and universities might be inclined to go along with them, given declining enrollments. This could open the door for younger scholars to join the professorial ranks. However, because universities have been relying more heavily on adjuncts, new PhDs may not be offered a tenure track position when they are hired. The long-term decline in tenure hires is accelerating due to COVID-19.

Academic departments will not be the only campus institutions to experience cuts; libraries will as well. Building collections in specific areas is already difficult. This will become much harder in the immediate future.

Although recent racial unrest has prompted some history departments to offer more courses on African American history, college campuses, ironically, could become less racially diverse. The disproportionate impact of COVID on African American communities may disrupt college plans for Black students more than for others. Some Black students may think twice about going to college, at least in the near term. And if they do go, they might decide to attend HBCUs, which are less expensive and can often provide a more nurturing cultural environment. Offering courses on African American history, slavery, and colonialism to classes that are overwhelmingly white will present awkward visuals for history departments attempting to position themselves at the forefront of diversity and inclusion efforts.

There will almost certainly be less immigration in a pandemic environment. Most wealthy nations have been limiting immigration for years, and the COVID-19 pandemic has reinforced this trend. It’s virtually certain that fewer foreign scholars will be visiting the United States. International programs, which exist on most university campuses, will find fewer students who can or want to travel overseas. Continuing education programs designed for senior citizens, which many history departments take part in, could become a thing of the past. College and university campuses will probably be much less diverse environments in the future.

There is no doubt that the longer the pandemic persists, the more probable it is that changes will occur and that some of them will be permanent. Historians need to prepare for these coming changes as much as anyone else. The longer they wait, the harder they will find it to adapt to change.

The Troubling History of a Black Man's Heart

To better understand the longstanding inequalities and injustices still plaguing America’s healthcare system, it helps to travel back in time to ancient Greece and heed the warnings of that early medical ethicist, Plato.

“The greatest mistake in the treatment of diseases,” he said, “is that there are physicians for the body and physicians for the soul, although the two cannot be separated.”

Similarly, the greatest mistake in trying to heal the long-held suspicions of American medicine among Black Americans is to forget that the heart of good medicine beats with trust.

I was reminded of this historical legacy last spring when some Black Americans said they were reluctant to accept free testing for the novel coronavirus. One young man near where I live in Richmond, Virginia told the Washington Post that his older employer had discouraged him from getting tested. “This might be a Tuskegee-type trap,” the employer warned him, referring to the notorious study of Black men in Alabama whose syphilis was deliberately left untreated. Sadly, even after penicillin became available to treat the disease, the men were allowed to suffer and sometimes die during a four-decade-long study that didn’t end until 1972.

As a college student in the 1970s, I found it astounding that such a dark chapter in American medical history – one smacking of racism, eugenics and Hitler-era torture – was still unfolding. It would take another quarter century before President Bill Clinton issued a formal apology in 1997.

More apologies – and perhaps reparations – may lie ahead as journalists and historians keep turning up secrets from our not-so-distant past. My own eyes were opened to the legacy of mistrust in communities of color during the three years I spent investigating the tragic saga of a Black man who entered a major hospital in Richmond in 1968 during a period not unlike today—marked by street protests, police brutality, and racial polarization after the assassination of Dr. Martin Luther King.

On the evening of May 24, 1968, a factory worker named Bruce Tucker was celebrating the end of his workweek by having drinks with friends. As he chatted with them, he lost his balance and fell from a wall. He sustained a serious head injury and, after initially resisting efforts to help him, was rushed to the emergency room of what was then called the Medical College of Virginia (today it’s VCU Health).

In my new book, The Organ Thieves: The Shocking Story of the First Heart Transplant in the Segregated South, I argue that the intense, worldwide competition to perform heart transplants during the 1960s set the stage for the wrongs done to this seriously injured patient and to his family. For at the same moment MCV’s transplant surgeons got the official approval to remove life support, Mr. Tucker’s own family and friends were frantically searching for him.

As two leading transplant ethicists, Robert M. Veatch and Lainie F. Ross, wrote in their authoritative 2015 book, Transplantation Ethics, “Many interpret the case as establishing a brain-oriented definition of death for the state of Virginia, but it could equally be viewed as the first case where a ventilator is disconnected for the purpose of causing the death of the patient, by traditional heart criteria, in order to procure organs for transplant.”

In 1968, unlike today, there was no effective national organ procurement system in place at the state-owned hospital where Tucker breathed his last. This essentially gave the transplant surgeons free rein to proceed with what they thought was best at the time for a seriously ill white patient: that is, to remove Tucker’s beating heart and place it in that patient’s chest. The recipient, a white businessman named Joseph Klett, survived only eight days after receiving the heart, which was either “donated” or “stolen,” depending on one’s point of view.

This experimental operation, with its deep racial overtones, led to the first “wrongful death” lawsuit brought over a heart transplant in the U.S.  As Veatch and Ross commented, “The doctors were exonerated, but it is unclear whether the court held that the doctors had the authority to turn off the ventilator on a still-living patient in these circumstances or whether it held that Tucker was already dead based on brain criteria.”

Based on my own reading of court and archival records, and on first-person interviews with surviving doctors, lawyers and some Tucker family members, here are a few more salient points about a case that led to the passage of Virginia’s first law recognizing the concept of “brain death” in 1973.

Though Bruce Tucker was still awake, with normal vital signs, upon admission to the hospital, the 54-year-old factory worker fell into a coma. With that, he was quickly identified by the night staff as a potential “donor” in what would be the first heart transplant in Virginia history.

The pressure to attempt this still-experimental procedure was intensified by the fact that only six months earlier a recent guest at MCV – Dr. Christiaan Barnard of South Africa – had won the heart transplant race. That achievement, in late 1967, turned Barnard into an overnight celebrity. The acclaim began to gnaw at the Richmond transplant surgeons – especially chief of surgery David Hume, who had invited Barnard to MCV. In the afterglow of the South African’s stardom, it is easy to see why the Virginians felt left out of one of the greatest feats in medical history.

Against this backdrop of competition, professional rivalry, and institutional aspiration, the odds for equal treatment were stacked high against a Black man with a head injury and alcohol on his breath. Bruce Tucker was easily seen as another “charity patient” who wouldn’t pay his bills, as one of the doctors who was on the scene later told me.

Tucker’s social invisibility, as other scholars have noted, led the hospital staff to conclude that this Black man, with no one coming to claim him, had little chance of recovery. Yet with his heart beating strongly and his other vital signs normal, he was quickly seen as a prime candidate to become an organ donor.

In the end, the decision to take out Bruce Tucker’s heart—along with his kidneys—was approved by a junior medical examiner on duty that fateful weekend. Adding to the family’s sense of loss was the frantic search by William Tucker, a cobbler with a shop nearby, to find his older brother before anything was done to him. Unfortunately, William arrived a few hours too late. It would be left to a local funeral director to break the gruesome news about Bruce’s missing heart and kidneys.

The Organ Thieves shines a bright light on this long-forgotten medical drama, which played out on a stage filled with racial discrimination and political and cultural polarization. It raises unsettling and still-unresolved questions about how someone could be treated in such a cold-hearted way.

While American medicine has improved a lot since then – for example, by ensuring “prior consent” before organs are donated – many of the injustices described in my book can be seen today when Black Americans share their own sense of suspicion and mistrust of the health care system, particularly during the pandemic.

Since the book was published, I’ve been heartened to hear from readers and reviewers alike that The Organ Thieves has revealed a long-forgotten chapter in the history of American medicine. This, in turn, has fostered important conversations about bringing long-overdue changes to our health care system.

For nobody deserves to be treated like Bruce Tucker.

Debates Are Unpredictable

If you think you know what will happen in the coming presidential debates, think again. They’re always unpredictable.

The first Kennedy-Nixon debate in 1960 is a good example.

Most pundits expected Richard Nixon to shine. Not only did he have eight years of experience as vice president, he also had a reputation as a skilled debater. But when the camera blinked on, Nixon seemed pale and weak. John F. Kennedy appeared cool, tanned and in command. Nixon’s one-point polling edge before the debate turned into a three-point lead for Kennedy after the debate. JFK went on to win the election by a hair.

In 2000, observers expected Vice President Al Gore, a capable debater, to wipe the floor with Texas Governor George W. Bush, who was not known for his fluency. But that didn’t happen. In their first showdown, Gore sighed, huffed and puffed, and left a bad impression. His pre-debate lead was wiped out. After all three debates, it was Bush who emerged in first place.

Americans like debates. They offer voters a chance to see candidates side-by-side. They motivate supporters and serve as tiebreakers for undecided voters.

But the effect of a debate can be temporary. In 2012, Mitt Romney delivered a superb performance in the first debate against President Obama, who was surprisingly flat. Romney erased Obama’s lead and opened a path to victory. But, it didn’t last. Obama’s much improved performances in subsequent debates put the incumbent back on top.

Debates are lost, not won. That was apparent in 1976, when President Ford declared that Eastern Europe was not under Soviet domination. It was a big blunder that played into the negative stereotype of Ford as dimwitted. Jimmy Carter didn’t win that debate, Gerald Ford lost it — and with it the election. Self-inflicted injuries are the worst kind.

Debates can make voters more comfortable with candidates, and lessen doubts. Smart combatants use them to identify with the nation’s zeitgeist.

In the closing days of the 1980 election, Ronald Reagan needed to show he had the substance and stability to be president. When viewed on stage alongside President Carter, the former actor and California governor presented the right contrast: strength versus weakness, change versus status quo. He did it by what he said and how he said it.

Debates have big audiences with gigantic media buildups. They work best for candidates who use them to clarify the choice that voters are about to make. They offer valuable opportunities to sharpen messages and sort out issues that have become jumbled in the fog of campaign warfare. That’s why the question Reagan posed at the end of his debate with Carter—“Are you better off than you were four years ago?”—was so effective.

Let’s remember, too, that a candidate’s principal task is not to win the debate, but to win the election.

In 1996, President Clinton was favored for re-election. He needed to make sure the debates did not alter the dynamics of a race that was going his way, even if it meant pulling punches. While his opponent, Sen. Bob Dole, delivered strong performances, Clinton achieved his objective of protecting his lead — and won the race.

Underdogs, however, need to shake things up. They can do it by unsheathing new issues and attacks.

Looking toward the big event next week, Americans are fixated on whether Joe Biden will falter or stumble. Literally. If he does neither, he wins round one of the expectations game. If the former vice president stands strong throughout all three debates, he could tighten his grasp on the lead. But if he’s put on the defensive and seems weak or disoriented, he could do himself serious damage.

Expect Trump to hammer Biden for a range of left-wing policy ideas that have been embraced by Democrats. Expect Biden to go after Trump on the coronavirus and healthcare, emphasizing the current administration’s uncaring and reckless policies.

Trump is full of surprises. In debates, surprise can kill––putting the target, and sometimes the aggressor, at risk. As the underdog in the polls, Trump has less to lose. He needs to change the dynamics of the race. He will likely position himself as the only bulwark against a takeover by the radical left. While the president could throw Biden off track with vicious attacks––on everything from China, tax increases and open borders to son Hunter’s business dealings in Ukraine––he could also open himself up to tough counterattacks.

Viewers will be watching Donald Trump, as they always do, with anticipation. For supporters, it will be an opportunity to reinforce rock-solid devotion. For opponents, it will be an opportunity to reinforce deepening revulsion. For the eight to ten percent of the electorate that is cross-pressured––that is, voters who aren’t happy with either candidate––it may be an opportunity to actually make a choice.

Anything goes in this election. And in the upcoming debates, anything can happen. The stakes couldn’t be higher. 

Anti-racist Lens Distorts History on New Jersey “Freeholders”

I grant there is no compelling reason for New Jersey’s counties to retain the traditional term “chosen freeholders” as the name for their elected officials. Under a bill signed into law by Governor Phil Murphy on August 21, the title of these officials will become “county commissioners” at the beginning of 2021. The new term certainly conveys what these elected representatives do better than “chosen freeholder” does.

But there is little basis for tying the older term to the history of slavery or racial prejudice, as many of New Jersey’s political leaders have done. Governor Murphy, for example, lent his support to the legislation by tweeting, “let us tear down words born from racism”. State Senate President Stephen Sweeney (D., Gloucester) claimed the title “is mired in the language of slavery”. And Felicia Hopson, Director of Burlington County’s Board of Freeholders, linked retiring the term to the goal of “[c]ontinuing our work to end systemic racism…by eliminating an antiquated title from an era when slavery and racism [were] tolerated…”. 

The term “freeholder,” first brought to the American colonies from England in the early seventeenth century, meant only a person who owned land (or other property) free of debt. The holding did not have to be a particularly large estate; by the mid-eighteenth century farms as small as half an acre were likely adequate to qualify. The idea was that such people, by virtue of their property ownership, would have the economic independence to be free from the influence of more powerful figures and could therefore be trusted with the vote. The “chosen freeholders” were simply the people selected by the freeholders at large to make the administrative decisions for a county until the next election came around.

The freeholders had a profoundly positive effect on the early development of liberal democracy, and nowhere more so than in New Jersey. A remarkable document, “The Concessions and Agreements of the Proprietors, Freeholders and Inhabitants of the Province of West Jersey in America” (1677), for example, established for the new settlements around Burlington the principle of rule by consent of the governed. Signed by 150 individuals, this early constitution contained a bill of rights, guaranteed religious liberty, and proclaimed it had “put the power in the people.” On its basis, the province’s first representative assembly, elected by the freeholders, convened at Burlington in 1681. Two other elected assemblies had begun even earlier in East Jersey. Once East and West Jersey came together to form the Crown colony of New Jersey in 1702, the freeholders continued to stand up for their rights against the royal governor and his council all the way until the American Revolution.

Who were the freeholders? Certainly, nearly all were men. And because Europeans had founded the colonies, the freeholders were overwhelmingly white. By the mid-1700s, Black Africans comprised about 7% of New Jersey’s population, the great majority of whom were enslaved, including by some of the freeholders.

But another unique feature of New Jersey’s history points to a way in which the ideal of freeholder democracy challenged even these limitations. New Jersey holds the distinction of being the only state, just after the start of the American Revolution, to have allowed both some white women (single and with a certain amount of property) and some Black men and women (those who were free, owned property, and, if female, unmarried) to vote. This unusual development in the history of American suffrage, which lasted for about thirty years, began without fanfare, indeed without any special notice at all – which in turn suggests that single, propertied women, both free Black and white, and free Black men of property had likely joined the ranks of the freeholders for some stretch of years prior to the Revolution.

The world of the colonial period was not the same as ours today. It was a world based on a principle of social hierarchy that remained largely unquestioned. Racial distinctions, at least in the northern colonies, did not lie at the center of this social system. A sizable minority – including the wealthy and the middle-class freeholders – occupied positions of independence. Beneath them stood a number of dependent classes: married women, tenant farmers, wage workers, servants, slaves, and the poor. And just as today we cherish the principle of freedom from arbitrary arrest (what came to be known as habeas corpus) that a group of English lords, who probably cared little about anyone other than themselves, won from their king back in 1215 with the Magna Carta, we can similarly pay tribute to the significant, if still limited, gains the freeholders of New Jersey made toward the expansion of popular participation in government.

Where Does the Democratic Party Stand on War, Peace, and International Relations?

After nearly four years of the Trump administration, U.S. voters have a pretty good idea of the policies that the President and his Republican allies champion when it comes to America’s dealings with other nations.  These policies include massive increases in military spending, lengthy wars abroad, threats of nuclear war, withdrawal from climate and nuclear disarmament treaties, a crackdown on refugees, and abandonment of international institutions. 

But what about the Democrats?  Do they, as some have charged, simply mirror the Republicans when it comes to America’s engagement with the world?  The official Democratic Party platform, adopted this August at the Democratic national convention, provides a useful answer to this question.

The foreign affairs section of the platform opens with a sharp rebuttal to Trump’s belligerent, nationalist approach.  Challenging militarism, it pledges to “use force only when necessary, always as a last resort, and with the informed consent of the American people.” It also promises to draw upon international partnerships and institutions to “meet common challenges that no country can face on its own.”

The platform’s discussion of U.S. military policy is particularly striking.  “We need to bring our forever wars to a responsible end,” the document states.  “Our military engagements, which have spanned from West Africa to Southeast Asia, have cost more than $5 trillion and claimed more than half a million lives. Our war in Afghanistan is the longest war in American history.”  Thus, “it’s time to bring nearly two decades of unceasing conflict to an end.”

Accordingly, the platform calls for a peace settlement in Afghanistan, for termination of U.S. support for the Saudi-led war in Yemen (a war that “is responsible for the world’s worst humanitarian crisis”), and for applying the lessons learned from these disastrous conflicts.  This means, among other things, that “we will work with Congress to repeal decades-old authorizations for the use of military force and replace them with a narrow and specific framework that will ensure we can protect Americans from terrorist threats while ending the forever wars.”  The platform adds: “Rather than occupy countries and overthrow regimes to prevent terrorist attacks, Democrats will prioritize more effective and less costly diplomatic, intelligence, and law enforcement tools.”

In line with this new approach, the platform calls for cutting the Trump administration’s bloated military budget—what it calls, in typical Washington-speak, “restoring stability, predictability, and fiscal discipline in defense spending.”  In justification, the platform notes that “we spend 13 times more on the military than we do on diplomacy. We spend five times more in Afghanistan each year than we do on global public health and preventing the next pandemic. We can maintain a strong defense and protect our safety and security for less.”

The platform also pledges that Democrats will initiate other reforms in the U.S. military. These include efforts to halt “the Trump administration’s politicization of the armed forces,” root out sexual assault within their ranks, and safeguard “the independence of the military justice system—not pardon war criminals.”  

Promising to “revitalize American diplomacy,” the platform argues that, “rather than militarize our foreign policy,” the Democrats would make diplomacy “our tool of first resort.”  Under a Democratic administration, the U.S. government would rejoin the World Health Organization, the UN Human Rights Council, and the UN Population Fund and seek to modernize international institutions.  Championing foreign assistance and development programs, the platform backs U.S. “investments in the prevention and alleviation of poverty, hunger, disease, and conflict,” and “the empowerment of vulnerable and marginalized populations.”  It also promises that “Democrats will lead international efforts to help developing countries withstand and recover from debt crises caused by the COVID-19 pandemic.”

Indeed, the Democratic platform sharply rejects the narrow nationalist approach of the Trump administration.  It contains strong commitments to act cooperatively with other nations to ensure global health (for example, by restoring the U.S. role as the leading funder and technical partner of the WHO), battle climate change (by rejoining the Paris Climate Agreement and developing more ambitious global goals to reduce greenhouse gas pollution), utilize technology for the public good (by maintaining an open internet), and expand the admission of refugees.  In yet another attempt to respect the rights of other nations, the platform promises to move the U.S. government’s “relationships in the Middle East away from military intervention” and to end the cruel policies of the Trump administration toward Cuba and Venezuela.

In line with this decreased emphasis on military might and increased emphasis upon international cooperation, the platform states that Democrats support the “elimination” of chemical, biological, and nuclear weapons.  They favor “reducing our overreliance and excessive expenditure on nuclear weapons” and declare that “the Trump administration’s proposal to build new nuclear weapons is unnecessary, wasteful, and indefensible.”  Furthermore, “Democrats commit to strengthening” the nuclear Nonproliferation Treaty, “maintaining the moratorium on explosive nuclear weapons testing, pushing for the ratification of the UN Arms Trade Treaty and Comprehensive Test Ban Treaty, and extending New START.”  Moreover, they would “work with Russia” to “negotiate [nuclear] arms control agreements . . . and move the world back from the nuclear precipice.”

Admittedly, the 2020 Democratic platform also contains occasional flag-waving rhetoric and a number of positions that are bound to irk at least some critics of Trump’s policies.  Also, of course, a party platform is a statement of policy preferences—not a guarantee of their implementation.

Even so, when it comes to war, peace, and international relations, the Democratic Party has outlined a program significantly different from that of its Republican counterpart.  In this November’s elections, American voters will have a clear choice as to what kind of role they want their country to play in the world.

An Open Letter to Congressman French Hill on the 1919 Elaine Massacre and the Dangers of “Patriotic” History

Dear Congressman Hill,

I have been contemplating for some time whether or not I should respond to your letter of July 15, 2020, in which you praised me for winning the John William Graves Book Award from the Arkansas Historical Association and expressed a certain interest in reading the revised edition of Blood in Their Eyes: The Elaine Massacre of 1919, of which I am a co-author. At the time I received it, this letter seemed to me some kind of sick joke, and as the months have passed, my sense of the inappropriateness of your letter has only grown. I hope that you will allow me to explain my meaning.

In mid-July, you will remember, the United States was still experiencing ongoing protests against police brutality—especially as directed toward African Americans—that erupted in the wake of the murder of George Floyd, who was choked to death in public so efficiently he may as well have been hanging from a noose. These protests were met with new waves of violence as police beat, shot, and tear-gassed citizens across the nation, and soon unidentifiable federal troops were systematically kidnapping Americans in places like Portland, Oregon, based upon rumors of “Antifa” affiliation. Donald Trump ordered an attack upon unarmed civilians at Lafayette Park in Washington DC, before curfew had fallen, so that he might pose with a bible in hand, and just a few days later, Senator Tom Cotton published a New York Times editorial demanding military intervention to “restore law and order.” In the midst of all of this, armed white supremacist radicals began appearing at these protests either to carry out murders that might be pinned on Black Lives Matter (as they did in California) or to attack the protestors themselves. 

During all of this, you never wavered in your faith in—and support of—Donald Trump. And so you can, I hope, understand why I was confused that you should so praise me for having produced two books related to the Elaine Massacre. The massacre, you might recall, was precipitated by the efforts of local African American sharecroppers in Phillips County to form a union and file suit against plantation owners for just recompense for their labor, as sharecroppers were regularly subject to peonage, or debt slavery, and abuse and exploitation of a thousand other varieties. Local authorities learned about the union, and two white law enforcement personnel, along with a black trusty, drove to the Hoop Spur church on the night of September 30, 1919, just as union members were meeting. The Hoop Spur church was shot up, and one of the white lawmen killed in an exchange of gunfire. In the hours that followed, rumors of a “negro insurrection” began to spread, leading the Phillips County sheriff to deputize American Legion members, who swiftly joined the mobs flooding into the area, killing everything in their path. Leaders in Helena called upon Governor Charles H. Brough to send federal troops to “restore law and order,” and he did, but these soldiers likely participated in the violence, killing African Americans with impunity. In the chaos, mobs and deputies and the military would kidnap black people and subject them to torture to make them confess that their ultimate plan was the murder of every white person in Phillips County. The violence left an unknown number of African Americans dead—a body count that may have reached into the hundreds.

So why, exactly, did you write me a letter praising me for bringing this kind of history to light? The policies enacted in 1919 are policies that you support right now. You and the president whom you serve have been working overtime to discredit any protests, peaceful or otherwise, by those who have long been subject to police brutality. You and the president whom you serve have long supported the sort of economic policies (such as tax cuts) that leave a handful wealthy and the workers in the fields starving and in rags. You and the president whom you serve have invited armed racist militants to act as “poll observers” to intimidate American citizens into not voting. 

Moreover, just last week, your president announced, at what he called the “very first White House Conference on American History,” the establishment, by executive order, of the “1776 Commission,” which he pitched as an attempt to rescue history from modern professors, saying, “Our mission is to defend the legacy of America’s founding, the virtue of America’s heroes, and the nobility of the American character. We must clear away the twisted web of lies in our schools and classrooms, and teach our children the magnificent truth about our country. We want our sons and daughters to know that they are the citizens of the most exceptional nation in the history of the world.” Among the examples of “toxic propaganda” Donald Trump excoriated were the field of critical race theory and the 1619 Project of the New York Times. 

In the light of Donald Trump’s executive order, I am even more perplexed by the kind letter you sent me, in which you declared me “a credit to Arkansas’s Second Congressional District” and praised my efforts in “preserving, writing, publishing, teaching and understanding Arkansas state history.” After all, aside from the books of mine you praise in your letter, I have also written a monograph on racial cleansing in Arkansas and edited a book on lynching in the state. In addition, I have a few books forthcoming that continue this analysis of racial violence in Arkansas. I believe that the best way to advance as a civilization is to understand the complex history that has resulted in the numerous social divisions present in our society, for by properly understanding the cause, we might effect a cure. However, the president whom you serve insists, rather, that books such as mine are “propaganda tracts… that try to make students ashamed of their own history.” What do you believe? Do you think me a credit to the district due to my efforts in the field of history, or do you think me a left-wing anarchist attempting to destroy our nation? Please let me know.

You may be familiar with the biblical passage of Matthew 6:24, in which Jesus says, “No one can serve two masters; for either he will hate the one and love the other, or he will be devoted to one and despise the other.” In the past, you have shown some devotion to the field of history. You even received the Arkansas Historical Association’s Tom Dillard Advocacy Award in 2018. However, you have also persisted in being a good and faithful servant to Donald Trump in everything he does. Does this mean that, with the establishment of Donald Trump’s 1776 Commission, you have renounced your love of Arkansas history and your appreciation of the historians who have been writing and teaching it? Please let us know.

I truly do despair at what Donald Trump’s demands for patriotism above everything mean for the field of American history. After all, we know how white patriots at the time of the Elaine Massacre were interpreting events. Just a few short days after the massacre started, one state newspaper described its origins thusly: “Vicious Blacks Were Planning Great Uprising.” Is this what we must teach in the future, Congressman Hill? Will historians like myself be condemned or canceled for departing from the party line that what happened at Elaine was anything but a “negro insurrection”? 

I look forward to hearing from you.

Sincerely,

Guy Lancaster, Ph.D.

Inspiration from the Banks of the Indus River: A Conversation with Nibir K. Ghosh

It’s been a season of uncertainty and dread in the United States as we contend with a deadly global pandemic, a reckoning with centuries of racism, bitter political divisions, historic environmental disasters, political corruption, and an unraveling of national institutions, among other challenges.

For another perspective on history and for words of encouragement, I consulted distinguished Indian author, scholar, editor, and public intellectual Professor Nibir K. Ghosh, a recognized and reliable source for knowledge, wisdom, and inspiration. 

In a lively dialogue by email, we recently discussed Professor Ghosh’s background, his literary study and works, and his thoughts on history and current events. He generously shared his views on the situation in India and its history as that huge nation now struggles with COVID-19 and approaches the numbers of cases and deaths that the US has grimly attained. With his background in American studies and his academic work in the US, his insights on our history and culture are particularly astute and timely.

Beyond the sweep of history and this fraught moment, Professor Ghosh shares insights on the writers and thinkers he studies. His new collection of essays and other writing, Mirror from the Indus, is a treasure trove of his words and wisdom with timeless relevance. For instance, note the resonance now of his comments on the lives and work of Mahatma Gandhi and Dr. Martin Luther King, Jr. And Professor Ghosh’s vision of the interconnectedness of all people in One World without discrimination is particularly instructive and inspiring today as both of our nations face the future with anxiety, ambivalence, and guarded hope.

Dr. Ghosh, D.Litt., is a UGC [University Grants Commission] Emeritus Professor and former Head, Department of English Studies & Research, Agra College, Agra, India. An eminent scholar and critic of American, British and Post-Colonial literatures, he has published over 180 articles and scholarly essays on various political, socio-cultural and feminist issues in reputed national and international journals.  

Professor Ghosh is also the founder and chief editor of Re-Markings (www.re-markings.com), an international biannual journal of research in English. Besides Mirror from the Indus, Professor Ghosh is the author of 14 other acclaimed books including Gandhi and His Soulforce Mission; Charles Johnson: Embracing the World; Multicultural America: Conversations with Contemporary Authors; Calculus of Power: Modern American Political Novel; Shaping Minds: Multicultural Literature; W.H. Auden: Therapeutic Fountain; and Perspectives on Legends of American Theatre.

Professor Ghosh was awarded a prestigious Senior Fulbright Fellowship at the University of Washington, Seattle during 2003-04. He is currently on the Review Panel of Multi-Ethnic Literature of the United States (MELUS), published by the University of Connecticut, and the African American Review, the Quarterly International Journal on Black American Literature and Culture of the Modern Language Association. During 1992-96, he was an Executive Member of the Board of Directors for the American Studies Research Centre (ASRC) in Hyderabad. Funded by the US government, the Centre was one of the most important institutions for American Studies outside of the United States. For two consecutive terms, Professor Ghosh was elected to the ASRC Board with an overwhelming majority, an unprecedented achievement in the history of the organization.

In 2018, the Osmania University Centre for International Programmes, Hyderabad, conferred upon Professor Ghosh the Lifetime Achievement Award during the Centennial celebrations of Osmania University.

 Professor Ghosh’s blogs www.nibirkghosh.blogspot.com and www.elsaindia.blogspot.com reflect his curiosity and his constant engagement with society, polity, culture, and more.

Robin Lindley: It’s a pleasure to hear from you, Professor Ghosh. Congratulations on your fascinating new book Mirror from the Indus. You’re a distinguished professor and scholar of literature and history as well as a public intellectual in India. Did your family and early schooling lead you to your career pursuits? Did you have some especially influential teachers and professors?

Professor Nibir Ghosh: Thank you for your keen interest in my work, especially in Mirror from the Indus.

When I look back from the vantage point of the present moment at my early childhood, I can easily recall how fond I have always been of reading for pleasure and wisdom. To my father, who served in the Indian Air Force, I truly owe the incessant urge to fall in love with words and ideas that came from reading fairy tales, illustrated comics, classical tales of adventure, stories of revolutionary heroes, pictorial books on Indian History etc. that I would receive from him as gifts. From my mother’s zeal in performing her pujas (worship), I developed an early interest in spiritual stories. As a student my favorite subjects were English, Science and Mathematics. I had my initial schooling in Air Force School in Delhi before the family moved to Agra where I joined the Air Force Central School.

In those days it was customary for bright students to have either Engineering or Medicine as appropriate career options. I fondly remember how my English teachers would always encourage me to participate and represent my school in debate, essay writing and elocution contests. When I stood first in V class, our principal, Mrs. I. Montes, gifted me King Arthur and the Knights of the Round Table retold by Phyliss Briggs, a story that impressed me a great deal. However, you will be surprised to know that teaching as a career never figured in my wildest imagination. After my schooling, I did my graduation in Science from Agra College, Agra (founded in 1823 during the British rule). While doing my graduation I joined the Coca Cola company as a Chemist for a while. 

It was after my graduation that I faced a dilemma: whether to go in for an M.Sc. in Physics or an M.A. in English literature. After a good deal of deliberate reflection, I finally opted for the latter. Though I had been reading literature for enjoyment for years, the post-graduation course opened a completely new universe because I was able to see in what I read the inevitable connection between literature, history, society and polity across numerous nations and cultures. When my name appeared in the merit list of the University, I began to receive offers of appointment as a Lecturer in English. That is how I entered the teaching profession. I had no regrets about not going for more lucrative jobs because the opportunity to teach literature gave me the happy satisfaction of being able to combine my vocation and avocation. 

Robin Lindley: That sounds like the ideal career choice. What was the subject of your doctoral dissertation and what did you learn from that study?

Professor Nibir Ghosh: The topic of my Ph.D. work was “W. H. Auden: From Communism to Christianity.” My doctoral work was a very exciting experience. It gave me access to a totally new way of looking at events, ideas and personalities beyond the limited confines of what I had hitherto been reading. It introduced me to the importance of interdisciplinary perspectives. 

In connection with my work I visited the British Council and American Center libraries in New Delhi on a regular basis and had long periods of stay at the American Studies Research Center at Hyderabad. The study of Auden’s poems made me delve into the Russian Revolution, the Great Depression of 1929, the Weimar Republic in Germany and the emergence of Hitler, the Spanish Civil War culminating in World War II, Existentialism as a philosophy, and psychology from Freud and Jung to Langland, besides various nuances of Christianity, all of which seemed necessary to gain the right perspective for studying the writings of Auden. 

Robin Lindley: How do you see the arc of your career from professor and author and now to chief editor of your ambitious and lively journal of the arts and culture, Re-Markings?

Professor Nibir Ghosh: I see a natural evolution in the ‘arc’ of my career from a teacher to author to the Chief Editor of Re-Markings. 

As a teacher I enjoyed interacting with my students and motivating them to see how the narratives they read were relevant to the lives they lived. My participation in seminars and conferences on a regular basis brought me into close contact with scholars and academicians from different parts of India and abroad.

Right from the time I joined the teaching profession, I got many opportunities for publishing my work in magazines and periodicals of repute. On many occasions it had struck me that I should do something in return for all the valuable space that my writings got in prestigious publications. That is how Re-Markings was born in March 2002 as an International biannual Journal of English Letters. I felt happy to provide a forum to aspiring scholars, academics, poets and critics to express their concerns. 

It is difficult to believe how time flies: in March 2021, Re-Markings is slated to complete 20 years of publication. As for the international outreach and the prestige the journal enjoys, you are in a better position to judge. In starting and continuing the publication from Agra, I must acknowledge the ideational and graphic support and guidance I have constantly received from its Executive Editor, Sandeep Arora. 

Robin Lindley: Much of your writing and research concerns British and American literature and history. What sparked this focus?

Professor Nibir Ghosh: My writings and research in British literature began with the study of authors and works prescribed in the master’s program and continued unabated through the study of Auden and beyond. In my M.A. course we had a special paper on Modern American Literature that introduced me to writers like Emerson, Walt Whitman, Mark Twain, Robert Frost, Emily Dickinson, Tennessee Williams, Arthur Miller, Eugene O’Neill and others. 

My knowledge of American history largely evolved from a course I attended at the American Studies Research Center, Hyderabad in 1978. The month-long course was titled “Looking for America.” The faculty comprised distinguished professors from American universities in the domains of literature, history and culture. One professor, John G. Cawelti of the University of Chicago and author of The Six-Gun Mystique, became a close friend. The discussions I had with him when he visited Agra, and later through correspondence, proved very valuable in enhancing my knowledge of American literature and history. 

Robin Lindley: What’s it been like for you to live and work in the romantic city of Agra, home of the Taj Mahal—a tribute to love? You rhapsodize about the remarkable city in your writing about the literary giant Rabindranath Tagore and in other work.

Professor Nibir Ghosh: Living, studying and teaching in Agra has been an enriching experience. Agra, having been the center of Mughal rule, is steeped in history. Monuments like the Taj Mahal, the Red Fort and Fatehpur Sikri make you feel part of a bygone era. It is from Fatehpur Sikri that Emperor Akbar preached his philosophy of Sulaha Kul, the essential oneness of all religions. I used Rabindranath Tagore’s poem on Shah Jahan to say that if a poet could give eternal life to a monument made in alabaster, how much greater must be his ability to give vibrancy to the nameless toiler and tiller of the land. 

Robin Lindley: In addition to your writing on literature, you have an excellent grasp of history and the context of the works you study. How do you see the role of history in your research and writing?

Professor Nibir Ghosh: I have always believed that no matter how much we talk about art-for-art’s-sake kinds of writing, literature isolated from history and culture cannot exist on its own. An extensive study of the relationship between American history, literature and politics became the focus of my book Calculus of Power: Modern American Political Novel, published in 1997. In this book I have examined American literature from the perspectives of Economics, War, Women’s Empowerment, Race, and American Justice on Trial. While engaged in this expansive project, I made an in-depth foray into the history of the foundation of America and its subsequent rise to superpower status. 

Robin Lindley: Your new book, Mirror from the Indus, presents a collection of your essays, tributes and memoirs. How would you like to introduce this remarkable collection to new readers?

Professor Nibir Ghosh: That’s an interesting question. The endorsements on the beautiful cover by celebrities like Ethelbert Miller, Dr. Tijan M. Sallah and Professor Jonah Raskin are bound to evoke great expectations in new readers. I would like to say that they will most likely find in my select writings a wide-ranging variety of themes, personalities and concerns. 

By exploring and examining the life and work of a very eclectic list of writers, poets, social reformers, spiritual giants, revolutionaries, freedom fighters, monarchs, statesmen, artists, and intellectuals, I have tried to show compassion and sensitivity to human concerns, the ability of individuals to be the change they wish to see in the world, the courage and grit to challenge the status quo, the defense of the right of individuals to exist as individuals, and the ordinariness of the extraordinary pursuits of enlightened humans in the terrain of the temporal as well as the universal. These themes, I believe, are bound to keep readers riveted to the collection.

Robin Lindley: The book is a gift to readers. I especially enjoyed your introductions to several writers and scholars who were new to me. A subtitle for Mirror could be something like Writing Without Borders. 

In the introduction, you describe this anxious time during a deadly global pandemic, and conclude that section with this inspiring sentence: “Let us all come together as members of One World to fight and defeat the forces of pestilences and usher in a glorious Republic of peace, prosperity and happiness without any discrimination.” It’s obvious that transcending boundaries is important to you. How can the humanities, the arts, help do this?

Professor Nibir Ghosh: Thank you for enjoying reading through Mirror. Yes, I agree that in keeping with the contents of the book, ‘Writing Without Borders’ could very well be taken as a subtitle. 

I have always believed, from my own experience of interface with people from different communities, religions, nations, cultures and the like, that innately there is in all of us a craving for a world without borders. It is only when we begin to get out of what Robert Frost calls the “Mending Wall” syndrome that real communication takes place in a spirit of easy give and take. I may cite my own life as a case in point. I was born in Poona (now Pune), the land of the Maratha ruler Shivaji, into a Bengali household. My mother tongue is Bengali. I have lived in the Hindi heartland most of my life. My wife (to whom I have dedicated the book) is a Punjabi. I have felt hugely enriched by not restricting myself to particular climes and regions, be they national or international. I have loved and enjoyed reading Mark Twain and Ernest Hemingway as much as Anton Chekhov, Albert Camus or Gabriel Garcia Marquez. If you look at the titles – “Beyond Boundaries,” “Embracing the World,” “Shaping Minds,” “Erasing Barricades,” “Multicultural America” etc. – of many of the books that I have authored and edited, you may notice that harmony and oneness constitute the essence of my creative and critical endeavors.

As an instance of my approach to overcoming prejudices and stereotypes, I would like to share an experience with a Pakistani gentleman. On my return home from the Fulbright tenure at the University of Washington, Seattle, I received a call from one Zeeshan-ul-hassan Usmani requesting me to edit a collection of essays written by Fulbrighters from India to America and from America to India. Considering the enormity of the task and the constraints of time, I said no. In the next minute, Zeeshan asked whether I could reconsider my decision in light of the fact that his mother hails from Agra. I had no alternative. I named the collection Beyond Boundaries. The arts and the humanities can go a long way toward creating bridges between cultures. In 2017, Re-Markings brought out A World Assembly of Poets as its signature Special Number, guest-edited by Dr. Tijan M. Sallah. The contributors included poets from all five continents and over sixty countries. Even a cursory glance at the volume will convince you how, in the true Republic of Poets, all demarcations separating one individual from another can disappear.

If you look at the list of contents in Mirror from the Indus, you may notice that the figures taken into account are from various communities, religions and cultures: Hindu, Brahmin, Dalit, Muslim, Sikh, Jew, Christian, Anglo-Indian, French, Canadian, British and what have you.

Robin Lindley: Your work brings light and pulses with your love of humanity and justice. Do you consider yourself a humanist?

Professor Nibir Ghosh: Yes, obviously. It is not a crime, I guess, to profess the love of humanity and justice. 

In our own era, from a pragmatic point of view, it may be gainful to avoid clichés like justice and human values because the majority always tends to remain in the mainstream and go with the flow of the current, but I strongly agree with Dr. Martin Luther King, Jr.’s statement that “our life begins to lose meaning the day we become silent about things that matter.” I learnt very early, from my experiences in several roles, that if one decides to fight for justice at any level, one must learn to conquer both ‘temptation’ and ‘fear.’ I have always tried to portray this through my own actions and through all my writings, talks and lectures.

Robin Lindley: Your writing reflects those values. And in your writing and well-researched articles, you make me want to read and learn more, particularly from the authors and books you cite. Is it fair to call you a literary activist?

Professor Nibir Ghosh: If activists are not identified with any flag-carrying activities, I would not mind being called a literary activist. Each issue of Re-Markings in its 20-year journey has remained committed to its manifesto of highlighting broad socio-political and cultural issues of human import so as to promote harmony through healthy interactive discussions and debates. Even when I am lecturing or delivering a talk to audiences comprising the youth, I remain focused on what each of us can do in our individual capacities to reduce the discrimination, disparity and prejudice that create yawning gulfs between one individual or group and another.

Robin Lindley: Bridging gulfs between people is a noble goal in today’s world. In your tribute to Mahatma Gandhi and his relevance now, you note how he influenced the likes of President Barack Obama and Dr. Martin Luther King, Jr., who called Gandhi’s influence “inescapable.” What do you think Dr. King meant?

Professor Nibir Ghosh: Please allow me to cite the words of Barack Obama on Gandhi that I have used in my tribute to Gandhi in Mirror: “He (Gandhi) inspired Dr. Martin Luther King. . . if it hadn't been for the non-violent movement in India, you might not have seen the same non-violent movement for civil rights here in the United States. . . He was able to help people who thought they had no power realize that they had power, and then help people who had a lot of power realize that if all they're doing is oppressing people, then that's not a really good exercise of power."  

Dr. King had reiterated that Gandhi had “lived, thought, and acted, inspired by the vision of humanity evolving toward a world of peace and harmony. We may ignore him at our own risk.” 

In a world torn by conflict and violence, Gandhi’s ideals of “truth and non-violence” may seem at times quite anachronistic but there is much logic in his simple observation that “an eye for an eye would make the whole world blind.” As a politician, Gandhi may have made mistakes but as a mortal he continued to perform his experiments with truth till the very end of his life. 

Robin Lindley: You’re well acquainted with the lives of Dr. King and Mahatma Gandhi and others who have worked for social justice. To help readers understand their strategy in working for justice, how do you think nonviolent resistance now might advance the dismantling of systemic racism in the US—and perhaps quell the political and religious friction in India?

Professor Nibir Ghosh: That’s a complex question. In order to dismantle the solid structures of systemic racism in the US, and of political and religious friction in India, along the lines of Martin Luther King and Gandhi, it is necessary that leadership spring from the youth, who will be able to project and guard the interests and concerns of their respective communities without bothering about promoting their own vested self-interests.

Mindsets cannot be changed with speeches and slogans; they can be broken only through sterling acts of self-sacrifice. Gandhi was forthright in pointing out in the “Introduction” to his Autobiography that “My experiments have not been conducted in the closet, but in the open…. My purpose is to describe experiments in the science of Satyagraha, not to say how good I am. In judging myself I shall try to be as harsh as truth, as I want others also to be.” What is relevant to caste/race applies equally to religion. 

Robin Lindley: You comment on many literary giants in your new book with sensitivity and understanding. I loved the jungle stories and other writing of Rudyard Kipling when I was young but later came to see him, as George Orwell did, as a hidebound British jingoist and imperialist and thus came to ignore his writing. You have a more thoughtful and nuanced view. How do you see Kipling’s writing?

Professor Nibir Ghosh: I do not wish to contest your dislike of Kipling and his writings but do allow me to point out that George Orwell, in the same remark that you allude to, admitted that “During five literary generations every enlightened person has despised him, and at the end of that time nine-tenths of those enlightened persons are forgotten and Kipling is in some sense still there.” 

In my view, in spite of Kipling’s jingoistic imperialism, he commands the admiration of readers by his sensitive approach to human problems. For over a century now, Rudyard Kipling’s poetic utterance, “Oh, East is East, and West is West, and never the twain shall meet,” has been used time and again, both in and out of context, by all and sundry to define visible boundaries that demarcate civilizations characterized by the East and the West. Consequently, I thought of doing a bit of research to find out what led Kipling to draw such an inference.

It is indeed ironical that Kipling’s most misunderstood statement is generally used by those who have probably not read the poem at all. On the strength of a single line, they are quick to conclude that there exists an unbridgeable gulf between the two civilizations – one supposedly ultramodern and the other gradually rising out of a relatively primitive past. Endowed with the bliss of ignorance, they tend to ignore, perhaps deliberately, the true import of Kipling’s observation, which does not end with the line mentioned above but runs to the length of a full quatrain that reaffirms human belief in synthesis and synchronicity by cutting across cultural barriers. The quatrain with which Kipling’s 1889 poem, “The Ballad of East and West,” begins and ends reads thus:

Oh, East is East, and West is West, and never the twain shall meet,
Till Earth and Sky stand presently at God's great Judgment Seat;
But there is neither East nor West, Border, nor Breed, nor Birth,
When two strong men stand face to face, though they come from the ends of the earth.

In my piece on Kipling in Mirror I have shared my inference that Kipling sees the relationship between the ruler and the ruled not as one permanently confined to master/slave binaries but as one that can, through courage and daring, meet on the level ground of equality.

In both spirit and flesh Kipling’s poetic statement ought to transform those who espouse the idea that civilizations should never mix and that cultural barriers are insurmountable. In the present era of communication and satellite revolutions it may be futile and superfluous to imagine that “mortal millions” should remain isolated and “alone” in inviolable cultural isles of their own. Also, you may have noticed from your reading of The Jungle Book how Kipling draws our attention to ways and means to deal with the environmental crises that we are now facing.

Robin Lindley: Thank you for those comments on Kipling’s still relevant words. In discussing the work of Somerset Maugham, you state: “Above all, Maugham has succeeded in demonstrating through The Moon and Sixpence that masterpieces are eternal contemporaries of mankind and have value and significance beyond the immediate confines of a particular moment in history.” How do you see “the confines of history”?

Professor Nibir Ghosh: Frankly speaking, what drew me to The Moon and Sixpence by Somerset Maugham was my deep interest in the life of Paul Gauguin. In school, I had read somewhere that when Gauguin had gone nearly bankrupt after quitting his job as a stockbroker in Paris, his wife scorned him, saying that if his paintings couldn’t even buy some medicines and a glass of milk for their ailing son, they were really worth nothing. Gauguin had calmly accepted that, though she was right then, his paintings would someday adorn the Louvre Museum in Paris.

Somerset Maugham’s fictional biography reminds us that though, striving for the ‘Moon’, Paul Gauguin may have landed himself with only ‘Sixpence’ in his lifetime, what is significant is how posterity has acknowledged his immortal creations.

My reference to “the confines of history” suggests that the immortality of an artist can never be judged by the contemporary appraisal of art but must await the continuous assessment of time beyond the immediate moment in history.

Robin Lindley: I enjoyed your tribute to the renowned English poet W. H. Auden, the subject of your dissertation. You write that Auden, though not a church-going Christian, saw the teachings of Jesus as “a strong reaction against the evil and absurdity of class and racial prejudice.” What did Auden see in the words of Jesus? 

Professor Nibir Ghosh: Thanks for your appreciation of my tribute to W. H. Auden in Mirror from the Indus. Auden’s views and his interpretations of Christianity are both descriptive and prescriptive. His prose pieces are as elaborately concerned with Christianity as his poetic outpourings. 

In numerous essays, Auden explores the theme of Christianity in its essence and tries to relate its relevance to man’s needs in contemporary society. For Auden, even a bleak post-war landscape attains significance when viewed through the perspectives of a Christian world. Though chaotic conditions exist, there is an undercurrent of hope that the situation is redeemable.

Auden considers God to be “the cause and sustainer of the universe” and says that “our real desire is to be one with Him. . . Ultimately that is the purpose of all our actions.” He demands that God should be invoked to restore order and meaning to the universe: “Let us praise our Maker, with true passion extol Him/ For, united by His word, cognition and power, / System and Order, are a single glory.”

Auden affirms the value of faith and what it can achieve. He extols the idea of faith in a world devoid of spiritual values. In his personal life too, Auden was wholly devoid of self-importance or pretentiousness, and he often revealed a humility that was both deep and genuine. Kindness and generosity were traits of his individual behavior.

On the basis of faith in God, Auden is able to assess the nature of ‘Love’ in a deeper and more precise manner. It is my strong assumption that Auden believed in the solitary and silent mode of praying and not in prayer as a spiritual exercise. He criticized the sectarian spirit displayed by the churches but honestly believed in the quintessence of Christianity. Christianity, for him, stood for something more profound than the celebration of empty ceremonials. 

Robin Lindley: You’re a friend of award-winning author, professor, public intellectual, and all-around brilliant scholar and artist Charles Johnson, a University of Washington professor emeritus. You wrote a book about his work, Charles Johnson: Embracing the World, with American poet and literary activist E. Ethelbert Miller. You also worked with Professor Johnson at the UW. How did you come to work with him and how do you see his place in the pantheon of American literary figures?

Professor Nibir Ghosh: Many years ago, when the Public Affairs Section of U.S. Embassy, New Delhi, informed me that Charles Johnson—author of Middle Passage, Oxherding Tale, Dreamer etc., a MacArthur Fellow and winner of the National Book Award—was visiting India on a lecture tour, and that I was to accompany him in India, I was thrilled by the prospect of interviewing him against the backdrop of the Taj Mahal.  My enthusiasm did not last long as his visit did not ultimately materialize on account of the Iraq war. Perhaps Fate had ordained that we would meet not in Agra but at the University of Washington, Seattle. 

Initially, when I was awarded the prestigious Senior Fulbright Fellowship, my choice as the place of work was the City University of New York, with Professor Morris Dickstein as my faculty associate. When I was given an additional option by CIE in Washington, DC, I decided to join the University of Washington, as my project was on contemporary African American writing, with Charles Johnson as my faculty associate.

Two days after settling down at an apartment on Furman Avenue (thanks to the kind courtesy of Professor Richard Dunn, HOD English), we were pleasantly surprised to see at our dwelling none other than the famed Charles Johnson himself, who, accompanied by his daughter Elizabeth, came to visit us. I warmly welcomed him by wrapping a shawl around him, as we honor scholars in India. Guess how he reciprocated! He gave me a huge packet he had brought for us. When I untied the fancy ribbons and opened the packet, there lay in front of us over two score books—novels, essays, interviews, photo-autobiography, and so much more—all of which he had authored. His endearing inscription on each one of them made them all the more valuable. I instantly realized the extent of the magnanimity and goodness that I had hitherto seen only in his correspondence. I may also mention here that Dr. Sunita’s project, as a Visiting Scholar at the School of Asian Languages, UW, under the guidance of Professor Michael Shapiro, was translating Johnson’s novel Dreamer into Hindi.

My frequent long conversations with him contributed significantly to my understanding of the nuances and complexities of certain basic issues confronting contemporary America and also inspired me to engage in fruitful conversations with many other celebrities within and beyond Afro-America.

We were truly privileged to be introduced by Charles to August Wilson, who invited us to dinner at the Broadway Grill. The animated exchanges that I had with authors like August Wilson, David Guterson, Octavia Butler, Jonah Raskin, Ethelbert Miller, Kathleen Alcala and others, besides Charles Johnson, flowered into a precious collection titled Multicultural America: Conversations with Contemporary Authors (2005).

Before meeting Charles Johnson, I was very much familiar with the works of Richard Wright, Ralph Ellison and James Baldwin and many other African American writers, poets, philosophers and critics. In my view Johnson has created an enviable niche for himself in the pantheon of African American writings.

Robin Lindley: How would you describe Professor Johnson’s style and voice as a writer of fiction and nonfiction?

Professor Nibir Ghosh: As you may be aware, Johnson’s work, especially his fictional output, is firmly grounded in Philosophy. I truly admire his non-fiction where his voice is most pronounced and impactful. His Buddhist leanings have not only added to the glory of his writings but also contributed a great deal to his abiding generosity and compassion that one can instantly recognize on meeting and talking to him.

I had interviewed him for my book, Multicultural America: Conversations with Contemporary Authors, and also for Re-Markings. It is very significant that, hinting at the danger of living in a parochial cultural fishbowl, he lyrically articulates the need for a completely new outlook that makes some narrow race-centered complaints irrelevant in an increasingly complex multicultural and global economy. He not only loves to address the symptoms of change in terms of acute identity crisis but also tries to prepare the aesthetic ground for such a change. Our mutual bonds of friendship brought him to Agra, where I enjoyed his and Sharyn’s loving company with the Taj as a backdrop in February 2018.

Robin Lindley: You’re a sensitive reader with innovative views of the literature you consider. I was struck by an essay you wrote on Joseph Heller’s classic satirical and painful war novel, Catch-22. You mentioned Wilfred Owen’s famous words on “the pity of war.” How did you come to write about Heller’s book? Are there other works on war you’d suggest for readers?

Professor Nibir Ghosh: As I have mentioned earlier in this conversation, a chapter of my book Calculus of Power: Modern American Political Novel is titled “In the Theatre of War” where I have taken up for discussion four war novels: For Whom the Bell Tolls by Ernest Hemingway, Catch-22 by Joseph Heller, The Armies of the Night by Norman Mailer and Slaughterhouse-Five by Kurt Vonnegut, Jr. 

Heller’s novel Catch-22 has always fascinated me for its unique approach to war and all that it involves. The central problem before the novel’s protagonist is to find the means and devise a strategy to survive in a hostile bureaucratic system. It is mainly through Yossarian’s inner conflict that one gets a fairly good idea of what it means to be trapped in such a system. Heller exposes the hypocrisy of a bureaucratic enterprise based on the purely vested interests of those who are at the top of the hierarchy and who want the war to go on irrespective of the need for a motive. Yossarian is decidedly against the capricious self-seekers who are either making money or having fun, and he sees no point in performing heroic deeds in order to win honor and worship, for he feels he can easily be replaced by any of the ‘ten million people in uniform.’

Unlike Fortinbras (in Shakespeare’s Hamlet), who was prepared to risk the lives of twenty thousand men for an eggshell, Yossarian has only one passion: to stay alive and fight those in power who are out to get him. He lives in perpetual dread of everything he can possibly imagine.

In a carnivalesque spirit Heller exposes the hypocrisy of the military bureaucracy without undermining, of course, the military strength and superiority of the United States of America. Through the use of an unconventional mode of aesthetic expression, blending pungent humour with the horrifying spectacle of war, Heller succeeds in conveying that the conventional heroics associated with war are no longer tenable in the modern era.

Robin Lindley: I appreciated the introduction in your new book to the work of Dalit poet Namdeo Dhasal. Decades ago, we were taught in my public school that the Indian caste system was extremely rigid and that Untouchables or Dalits were outcasts doomed to lives of drudgery and brutal discrimination without hope of social mobility. What is the reality of the caste system now and the situation of Dalits today?

Professor Nibir Ghosh: What you were taught decades ago about the Indian caste system being extremely rigid has remained in resonance with ground reality even in contemporary times.

As the ambivalence of the “American Dilemma” continues to haunt the conscience of the most powerful democracy in the world, the USA, no less problematic is the issue of Caste for the world’s largest democracy, India. During elections it can be seen how important a role caste plays in determining the suitability of a contestant fielded by any political party.

According to many noted Dalit writers, it is true that oppression and humiliation of the Dalits have not ceased. They exist still in subtler variations in many segments of society and polity despite sweeping changes in legislations and legal sanctions.

I have specifically mentioned in my essay on Namdeo Dhasal in Mirror from the Indus that, though India can take pride in upholding its democratic credentials by installing two Dalit Presidents in the Rashtrapati Bhavan and electing a Dalit woman chief minister four times in the largest state in India, besides numerous ministers to the union and state cabinets, it cannot be denied that Dr. B. R. Ambedkar’s dream of liberty, equality and fraternity continues to elude the Dalit community in India.

My view is that the Dalits in India and the African Americans in the US who come from poor economic backgrounds must be made to understand the importance of upward mobility through education and work skills despite all the challenges that may threaten such initiatives. Also, the ones who have reached the higher echelons of power through affirmative action/reservation must take the initiative to encourage their less fortunate brethren to rise and shine in a grossly unequal world. 

A large measure of hope for the Dalits lies in the fact that they are getting increasingly articulate in projecting their rights and responsibilities through their writings in print and social media.

Robin Lindley: When Dr. King visited India in 1959, a school principal referred to him as an American “Untouchable.” King was stunned but, on reflection, agreed with that assessment. A big question, but from what you know of America and our history, is the view of Black people in the US comparable to how Dalits or “Untouchables” are seen and treated in India?

Professor Nibir Ghosh: Dr. King may have been surprised to be seen as a “Black Untouchable” in 1959 because he may not have been aware of the fact that Dr. B. R. Ambedkar, the Dalit icon, had first brought to light the similarities between the predicament of the African Americans in the US and the Dalits in India in terms of oppression, discrimination and inequality. 

W. E. B. Du Bois had written a letter to Dr. Ambedkar lauding his leadership in the Dalit cause. Dr. Ambedkar had inspired and encouraged several Dalit scholars to go to the U.S. to study African American literature and to interact with activists in the field. African American literature, consequently, served as a model for Dalits in India who wanted to give expression to their suffering and agony on account of centuries of exploitation and discrimination. Time and again, Dr. Ambedkar pointed out to his devout followers that they could learn from their African American counterparts how to articulate their emotions with boldness and daring. Using the activist model provided by the Black Panther movement, the Dalit Panther movement was created in Maharashtra.

There are close parallels where race in the US and caste in India are concerned, though some, like Lama Rangdrol, may argue that the Dalits live in greater misery than the average Black person in America.

Though atrocities against Dalits continue to be seen in India, it cannot be denied that changes in attitude are also visible in Dalit writings. New ways of thinking, the outlook of the new generation, scientific and technological advancement, the IT revolution etc. have effected a paradigm shift in people’s consciousness.

The discriminatory modes too have undergone changes. The social media and the internet provide the opportunity to connect with everyone on earth without the prejudice of caste, creed, color, religion or nationality. 

Robin Lindley: Like me, many readers may be puzzled by the ongoing religious tensions and eruptions of violence on the south Asian subcontinent. Did the tensions today originate with the partition and independence in 1947, or was there always violence between the two primary religions, Hindu and Muslim? This topic is worthy of many books, but what’s your sense?

Professor Nibir Ghosh: It would not be correct to conclude that religious tensions and eruptions of violence between the Hindu and Muslim communities in India originated as a result of the partition of the nation in 1947. Of course, the partition drove a permanent wedge between the two communities, and those who had lived in peace and harmony for ages turned foes overnight and participated in orgies of violence that remain unparalleled in the history of the sub-continent.

In my opinion the Hindu-Muslim discord is a legacy of the divide-and-rule policy of the British Government. The First War of Independence (which the British designate as a mutiny), which took place in 1857 and literally shook the citadel of English rule in India, was fought with the Hindus joining hands with the Muslims to drive away the British. Consequently, after the failure of the combined forces, the British realized that in order to consolidate their Empire, it was necessary to pit one community against the other. In fact, the English succeeded in their sinister design by creating pressure groups who advocated the partition of the country. It is, however, relevant to note that the Indian National Army (INA) under the leadership of the revolutionary leader Subhas Chandra Bose offers a unique example of Hindu-Muslim amity and brotherhood.

Even today, the legacy of creating communal discord under the divide-and-rule policy seems to be a convenient tool in the hands of politicians to sustain their political existence. 

Robin Lindley: Our current president Donald Trump and India’s Prime Minister Modi are seen by some commentators as similar in that they both use fear and division to appeal to their political bases. Our countries are very different, but do you agree with that view of the two leaders? How do you see them? 

Professor Nibir Ghosh: History bears evidence to the fact that, be it democracy or dictatorship, leaders do resort to the use of fear and division to keep themselves in power. The strategy of the two leaders you mention may be quite similar when it comes to consolidating their respective political bases. But what makes Modi different is that he enjoys the admiration of people from the lower economic strata on account of his ability to connect with them on a one-to-one basis through his emotional speeches and seemingly genuine concern.

Robin Lindley: Indian writing in English is gaining popularity in the United States. Who are a few Indian writers you’d recommend to American readers?

Professor Nibir Ghosh: Since most American readers are already aware of the much-hyped works of Booker and Pulitzer Prize recipients who are immigrant US citizens, I would recommend writers like R. K. Narayan, Mulk Raj Anand, Raja Rao, U. R. Ananthamurthy, Mahashweta Devi, Munshi Premchand, Nissim Ezekiel, Jayanta Mahapatra among numerous others.

Robin Lindley: You thoughtfully consider this era of the COVID-19 pandemic in the introduction to Mirror from the Indus and in your recent blog entries. The United States now leads the world in COVID cases and deaths. What is the situation in India with the pandemic?

Professor Nibir Ghosh: India is closely following on the heels of the United States in rising COVID-19 cases. Population density is a major cause for worry in India. Poverty, unemployment, and a lack of health care and infrastructure facilities add to the challenge. In fact, the onus of protection from the coronavirus largely rests on individuals in terms of social distancing and sanitization. Ayurvedic medicine and herbs seem to provide some hope for increasing immunity to check the effect of the virus.

Robin Lindley: You offer many encouraging and wise words at this time of peril for the entire globe. Where do you find hope at this challenging time?

Professor Nibir Ghosh: I have elaborately stated in the Preface to Mirror from the Indus that what we need most in this time of peril is to heed the voices of philosophers, poet-prophets, writers and intellectuals who have warned us time and again to bring in a revolutionary change in our attitude and approach to halt our onerous march toward doom. 

Like mindless robots we have often refused to listen to the voices of sanity. In 1762, at the very beginning of The Social Contract, Jean-Jacques Rousseau asserted that “man is born free and everywhere he is in chains” and suggested that the only way we could break the fetters was to “return to nature.” Following Rousseau, William Wordsworth warned us to refrain from entering the whirlpool of the endless cycle of getting and spending. Rather than enter into a “Social Contract” to bridge the seemingly insurmountable gulf between affluence and poverty, mankind moved on, unmindful of impending catastrophes, presuming that the powerful, the wealthy and the affluent would always remain untouched by such storms of adversity.

We are bound to feel pessimistic when we are reminded of the recent happening in Minneapolis, where four policemen, in the manner of the deadly virus, created the respiratory distress that led to the death of George Floyd, a Black American. The event clearly demonstrates the human resolve to continue with the status quo of the powerful asserting their dominance over the oppressed and powerless wings of society.

However, it can certainly be hoped that the day is not too far away when we can assuage the accumulated guilt of centuries by inculcating feelings of compassion and universal brotherhood toward the downtrodden and helpless masses. We must learn to accept the paradigm shift from the emphasis on integration and inter-connectivity of a globalized world to the new norms of social distancing, isolation and quarantine. COVID-19 has come with numerous lessons for mankind, the most prominent being the need for compassion and a fellow-feeling of love and brotherhood for one and all.

If we join our hands and hearts in this hour of grave global crisis, curb our own immediate self-interests, and work in communion for a society where individual happiness can coexist in harmony with the general good of all, there is enough room for hope and optimism.

Robin Lindley: Thank you, Professor Ghosh, for your illuminating comments, and congratulations on your compelling new book Mirror from the Indus. It’s sure to be a resource for many years to come. And, as renowned American poet and past University of Washington professor Theodore Roethke said, “In a dark time, the eye begins to see.” At this anxious time, I find your words and your writing reassuring, and I know other readers too will appreciate the light you cast in this dark time.

Professor Nibir Ghosh: Thanks, Robin, for your deep interest in my work. I thoroughly enjoyed this conversation. I shall be happy if the light of my book illumines even a little corner of a heart in despair.

Robin Lindley is a Seattle-based writer and attorney. He is features editor for the History News Network (hnn.us), and his work also has appeared in Re-Markings, Writer’s Chronicle, Crosscut, Documentary, Huffington Post, BillMoyers.com, Salon.com, NW Lawyer, ABA Journal, Real Change, and more. He has a special interest in the history of human rights, conflict, medicine, and art. He can be reached by email: robinlindley@gmail.com.

Trump’s Comments on History Point Down a Stalinist Road

What President Trump said on September 17 at a White House Conference on American History was in keeping with his people-beyond-reproach view of our past. It also reminds me of Stalin's approach to Russian history. Like Stalin, Trump is trying to tell us to teach only his narrow version of “patriotic” history.

The president’s remarks (delivered in the National Archives Rotunda, the home of the Constitution) begin this way: “Our mission is to defend the legacy of America’s founding, the virtue of America’s heroes, and the nobility of the American character. . . . We want our sons and daughters to know that they are the citizens of the most exceptional nation in the history of the world.” He went on to specifically criticize the New York Times’ 1619 Project: “This project rewrites American history to teach our children that we were founded on the principle of oppression, not freedom.”

Historians have both praised and criticized this New York Times project, but as an article reposted on HNN indicated, Trump is “threatening to censor the way schools teach about the history of slavery and racism.” In emphasizing that history should teach American exceptionalism, “the virtue of America’s heroes, and the nobility of the American character,” and criticizing a project that stresses the centrality of slavery and racism to our history, Trump is starting down a Stalinist path.

During the 1930s, in the face of threats from Nazi Germany and an increasingly militant Japan, Stalin directed historians and schools to teach patriotism to help convince Soviet citizens that, like earlier Russian leaders such as Alexander Nevsky, Ivan the Terrible, and Peter the Great, he was defending Russian interests. In the 1920s, communist historians had generally been critical of such pre-Soviet rulers and indeed of much of tsarist Russia. But in the 1930s Stalin realized that history and patriotism could be more effective than appeals to communist ideas in uniting the country behind him. Two significant Sergei Eisenstein films of the period 1938-1944, both overseen by Stalin, Alexander Nevsky (1938) and Ivan the Terrible, Part 1 (1944), give some idea of how Stalin now wished Russia’s past heroes to be portrayed. The first film ends with Nevsky’s just-uttered words shown in big letters on the screen: “He who comes to us with a sword, by the sword shall perish. On that our Russian land takes and will forever take its stand.”

With Ivan the Terrible, who ruled in the sixteenth century, Stalin felt a special affinity. Robert Tucker’s Stalin in Power tells us that Stalin said Ivan was a “great and wise ruler” and that he was “a role model” for Stalin’s “Great Purge” of the late 1930s.

Stalin’s dissatisfaction with Soviet historians’ depiction of the Russian past became evident in 1934, when he invited a group of them to the Kremlin, where he told them of his displeasure. Shortly thereafter he signed a decree mandating that new school history texts be prepared, and he later reviewed prospectuses for such texts. When a new text was approved and appeared in 1937, Tucker relates that Ivan the Terrible was depicted as “a hero of Russian history” and that this view of Ivan now became “a part of the elementary education of every Soviet schoolchild.”

In 1938, Stalin went beyond telling others what history to write: he edited and partly wrote history himself in History of the All-Union Communist Party: Short Course. In his Soviet Civilization, Andrei Sinyavsky wrote that it “became required reading for any literate Soviet citizen . . . [and] the bible of Stalinism.” Before Stalin’s death in 1953, almost 43 million copies had appeared in 67 languages. Regarding the same Soviet period and the rewriting of history, historian Robert Conquest stated, “Accompanying falsifications took place, and on a barely credible scale, in every sphere. Real facts, real statistics, disappeared into the realm of fantasy. History . . . was rewritten. Unpersons disappeared from the official record. A new past, as well as new present, was imposed on the captive minds of the Soviet population.”

Both Stalin and Trump built upon the work of forerunners who approached history from an ideological position and attempted to buttress it with their version of the past. In the USSR it was the historians who adhered to Communist guidelines as determined by Lenin and Stalin. In the United States a “powerful tradition of educational conservatism has had a decisive role in shaping schools and culture” and influenced the teaching of history.

In affecting the writing and teaching of history, Trump, of course, has not gone as far as Stalin. As the title of this essay indicates, he has just “started down a Stalinist road.” How much farther he might go will depend largely on American voters. As I pointed out in an earlier HNN essay, in addition to the similarities of Trump and Stalin, there are many differences, including their operating in very different political cultures.   

One difference not mentioned in the earlier essay was the two men’s interest in and knowledge of history. Although I did mention Trump’s “deeply disturbing” lack of historical knowledge in a later essay, that essay did not mention Stalin. According to multiple sources, Stalin was both interested in and knowledgeable about his country’s history. Tucker, for example, has written that “he was prone . . . to see situations in terms of historical parallels,” that he was a “voracious reader,” and that he displayed a “keen interest” in Russian history. Contrast this with Trump, who seems to possess a “lethal aversion to reading.”

Thus Stalin’s problem was not (à la Trump) a lack of historical interest or knowledge, but his egoism and desire to skew history for his own sake. Unfortunately, though Trump knows little real history, he shares these traits. 

Donald Trump's "1776 Commission" for "Patriotic Education"

On September 17, Donald Trump called for a commission to require the teaching of history to encourage patriotism and reject critical interpretations of the nation's past. Click here to view HNN's coverage of the topic, both original essays and reposted content from around the web. 

Was Charles Lindbergh a Man Who Got Away?

Charles Lindbergh testifies at the murder trial of Bruno Richard Hauptmann.

The 1932 Lindbergh kidnapping appears on every expert’s list of 20th century American cases called “the trial of the century.”  Yet investigators disagree to this day on the answer to a basic question: was the man tried and executed for the crime guilty, or was he framed?  In the frenzied atmosphere surrounding the prosecution of Bruno Richard Hauptmann, he never got a fair trial. But was he innocent? His widow insisted so in her six-decade battle to clear his name. Most authors who have revisited the case say no. I say think again.  

Complicating every effort since March of 1932 to unravel what happened was the level of fake news that blanketed the case from day one. Even though the crime happened nearly 90 years ago, many facts have only recently been uncovered. My new book, The Lindbergh Kidnapping: Suspect No. 1, The Man Who Got Away, takes advantage of the distance in time to treat the boy’s father as a potential suspect in his son’s kidnap and murder. Why did Lindbergh get treated with kid gloves during the investigation of a crime in his own home – even after Scotland Yard suggested the parents should be investigated? 

Most older Americans recall learning of the singular achievement of the man born nearly 120 years ago. One biographer called him America’s “last hero.”  For those in younger generations who did not get taught about Charles Lindbergh, he vaulted to sudden fame in 1927 for completing the first solo nonstop transatlantic flight, from New York to Paris. It ushered in a new era of global connectedness -- prompting many sites across the country to be renamed in his honor. History buffs have likely read one of several popular biographies of Lindbergh. True crime fans of the History Channel recognize Lindbergh as the father of the 20-month-old victim of the mysterious “crime of the century” that riveted the entire nation in the 1930s. Over time, a less flattering portrait of America’s hero has emerged. The reclusive villain in the Pixar movie “UP!” was modeled in large part on the aviator. Many viewers watched the recent HBO miniseries based on Philip Roth’s novel The Plot Against America — which rewrote history to reveal Lindbergh as a Nazi-sympathizer whose America First campaign defeated FDR in 1940 to keep the United States neutral in World War II. 

Though his legacy is now mixed, none of us today can really appreciate Charles Lindbergh’s heroic image in the late 20s through the mid-30s. He displayed the courage of pilot Sully Sullenberger -- at half Sully’s age -- with the charisma and good looks of a young Brad Pitt and the persistent media attention accorded Donald Trump. Since 1927, millions of Americans had seen their hero featured week after week in headlines, radio reports, and newsreels. Followers devoured news of his marriage and the birth of his namesake -- like fans today avidly consume developments in the lives of Prince Harry and Meghan Markle. In the midst of a national Depression, pride in Lindbergh lifted people’s spirits. Suddenly came news that his toddler son had been snatched from his nursery and held for ransom – at a time when gang kidnappings of children of the rich and famous had averaged more than two a day since 1929. Only this time the victim was a toddler revered as the crown jewel of the nation – the only son of America’s royal family. The kidnapping of Charles Lindbergh Jr. shocked and enraged Americans to their core.

After participating as a member of O.J. Simpson’s “Dream Team” of defense lawyers, Professor Gerald Uelmen wrote Lessons from the Trial: The People v. O. J. Simpson. In that book, he observed: “The most remarkable aspect of every ‘trial of the century’ . . . has been the insight it provides into the tenor of the times in which it occurred. It is as though each of these trials was responding to some public appetite or civic need of the era in which it took place.” The Depression-era Lindbergh kidnap/murder took place amid a sharp rise in xenophobia in a national political environment dominated by white supremacists and social Darwinists, who feared the degradation of their race by an influx of immigrants. All of these factors figured in how that case played out.

When police investigate various suspects, they generally consider motive, opportunity and means, as well as later conduct demonstrating consciousness of guilt. Yet in the Lindbergh kidnapping case, the police applied those criteria only selectively and completely ignored one insider. After Hauptmann was tried and executed, questions still lingered. Investigators also noted Lindbergh’s odd behavior in the wake of the crime – conduct which either intentionally or negligently obstructed the police investigation.

Near the end of World War II, British military historian B. H. Liddell Hart invited his readers to open their minds to face facts that might be disquieting: “Nothing has aided the persistence of falsehood, and the evils that result from it, more than the unwillingness of good people to admit the truth when it was disturbing to their comfortable assurance.” A key suspect the police focused on the day after the abduction was labeled by the FBI “UNKNOWN PERSON NO. 1 (Man with Ladder Near Lindbergh Home).” For shorthand, I call him “Suspect No. 1” – a slim figure in a long stylish coat and fedora, glimpsed at dusk with a ladder in his car at the entrance to the Lindberghs’ driveway earlier on the same evening as the kidnapping. What impact did it have on the investigation to allow Lindbergh full authority to direct the state police investigating that crime?

Today, we have both the benefit of insights provided by previous scholars and sleuths and a treasure trove of evidentiary puzzle pieces whose significance had previously been overlooked. I invite readers to focus on a key question police never pursued back in the spring of 1932 – was international hero Charles Lindbergh himself Suspect No. 1, the man who got away? And then judge for themselves.

The Roundup Top Ten for September 25, 2020

What Trump is Missing About American History

by Leslie M. Harris and Karin Wulf

"Journalists and politicians are examples of two groups that are differently but equally susceptible to a desire for clarity and simplicity about the historical past. But the past is rarely clear and was never simple."

Are We Ready to Rehabilitate George W. Bush’s Reputation?

by Andrew R. Graybill

The presidency of Donald Trump has allowed supporters of George W. Bush to push for a reevaluation of a man who left office with historically high unfavorability ratings. A SMU professor digs into recent books by way of evaluating whether Dubya will get a raw deal from history.

Ruth Bader Ginsburg Made The Impossible Look Easy

by Serena Mayeri

Ruth Bader Ginsburg's achievements were remarkable, but a professor of law and legal history argues that her determination to open paths for others to follow her was greater. 

Is Academe Awash in Liberal Bias? Most People Think So. They're Wrong

by Naomi Oreskes and Charlie Tyson

Available data do not support claims that university professors are extremely leftist, that a majority of students are being educated by left-wing professors, or that academe is biased against conservatives. So why do so many people believe these claims? Methodologically flawed studies and a long-running culture war.

Donald Trump’s Bizarre History Conference

by Ron Radosh

"There is good history and bad history, and either can be written by historians on the left or on the right. There is no such thing as left-wing history or right-wing history. There is only historical research and the conclusions drawn from evidence."

Is Freedom White?

by Jefferson Cowie

In American mythology, there exists a gauzy past when white citizens were left alone to do as they pleased with their land and their labor (even if it was land stolen and labor enslaved). In the legend, those days of freedom and equality were, and still are, perpetually under assault. 

Trump’s Vision for American History Education Is a Nightmare

by L.D. Burnett

"As a historian who writes about the field of history’s place in the culture wars of the 1980s, I watched this conference and saw one long exercise in logrolling for the participants’ politically intertwined institutional commitments."

The Endless Fantasy of American Power

by Andrew Bacevich

Neither Trump nor Biden seems prepared to do the necessary work of moving military power and force from the center of American foreign policy. The consequence will be further endless war at the expense of the global-scale policies needed to confront the most urgent threats.

The Militia Menace

by Tom Mockaitis

The time has come to stop mincing words about militias and other far-right extremist groups. They are at best-armed vigilantes and at worst domestic terrorists acting on behalf of a racist ideology.

Scapegoating Antifa for Starting Wildfires Distracts from the Real Causes

by Steven C. Beda

The idea of left-wing radicals starting wildfires in the Pacific Northwest dates back to timber companies blaming the Industrial Workers of the World for blazes as a way to discredit demands for workers' power through unions. 

Trump's "Patriotic Education" Commission Yet Another Battle Over the Meaning of Those Words

On Thursday, President Trump announced that he would be creating a “patriotic education” initiative called the “1776 Commission” that will develop a “pro-American curriculum” for the nation’s schools. Attacking the New York Times’ Pulitzer-winning 1619 Project and other anti-racist educational frameworks as “toxic propaganda,” a “crusade against American history,” and “a form of child abuse,” Trump claimed that “patriotic moms and dads are going to demand that their children are no longer fed hateful lies about this country.” 

Trump’s remarks are part of an ongoing conservative response to the 1619 Project, one that has frequently emphasized 1776 instead—as, for example, did the Woodson Center’s 1776 Project, which creator Bob Woodson described as a “challenge to those who assert America is forever defined by past failures.” Woodson, Trump, and all those voices advocating for 1776 consistently frame their arguments as exemplary patriotism, and concurrently seek to categorize the views of the 1619 Project and its advocates as not just overly critical of America, but also as fundamentally unpatriotic.

In so doing, Trump and company are expressing a familiar combination of celebratory and mythic patriotisms, two of the four forms of patriotism I define in my forthcoming book Of Thee I Sing: The Contested History of American Patriotism (January 2021). That combination is in many ways our collective default, what we generally mean when we talk about patriotism. But if we recognize that there are other longstanding and equally valid forms and expressions, we can better understand both the contested history of American patriotism and the legacies of that battle in our 21st century moment.

The most familiar form of patriotism is what I call celebratory, the form embodied in shared communal rituals: the singing of the national anthem with hat in hand and hand on heart; the recitation of the Pledge of Allegiance by schoolchildren at the start of each day; the closing of speeches with “God bless the United States of America.” Out of such everyday rituals, as Michael Billig argues in his influential book Banal Nationalism (1995), a sense of national belonging and community is constructed. Those rituals and that community are at least potentially inclusive, able to be shared by all Americans.

But in practice, as we see today with Trump and company, American celebratory patriotism has often been wedded to a second and far more divisive form: exclusionary mythologizing (or mythic for short) patriotism. This form relies on mythic visions of both patriotism and the past in order to exclude a number of Americans, in two interconnected ways: excluding those voices that critique America’s flaws, since they are not sufficiently celebratory; and excluding those communities and stories that do not fit into the idealized narrative of history on which mythic patriotism depends.

The first exclusion is the more familiar element of mythic patriotism, captured in longstanding phrases like “love it or leave it” and in narratives that define criticism of the nation as fundamentally unpatriotic and anti-American. Such attitudes are exemplified by the 1918 Sedition Act, which made it illegal “to willfully utter, print, write, or publish any disloyal, profane, scurrilous, or abusive language about the form of government of the United States ... or the flag of the United States.” In his recent call for Black Lives Matter protesters to be charged with sedition, Attorney General William Barr has overtly sought to return to this extreme version of mythic patriotism’s exclusions.

 Mythic patriotism’s second form of exclusion, the exclusion of histories that do not sufficiently align with the celebratory vision, is just as divisive, as it focuses definitions of both patriotism and America on particular, mythologized communities. Whether the 1776 Project’s emphasis on Founding Fathers who embodied American ideals or the 1620 Project’s narrative of “the traditions of liberty” that all Americans have “inherited from Plymouth,” this mythic vision of American patriotism and identity both flattens the complexities of the histories on which it focuses and elides the stories of American communities such as enslaved African Americans during the Revolution and indigenous cultures in (and after) the era of European settlement.

In response to such exclusions I would highlight two other, equally foundational and longstanding forms of American patriotism that directly challenge those celebratory myths and offer alternative visions of our histories and identity. There’s active patriotism, a form which defines individual and collective actions like service, protest, and sacrifice, rather than passive participation in established rituals, as the space for expressions of patriotism. An active patriotic emphasis would highlight as exemplary American patriotic communities like the United States Colored Troops during the Civil War and the suffragist activists known as the Silent Sentinels, who protested outside the White House for the two years before the ratification of the 19th Amendment.

Through their critique of the exclusion of women from the political process, the Silent Sentinels also illustrated my fourth form of American patriotism: critical patriotism, which recognizes where and how the nation has fallen short of its ideals and seeks to push us closer to them. No single quote better captures this critical patriotic perspective than does James Baldwin’s, from his essay collection Notes of a Native Son (1955): “I love America more than any other country in the world, and, exactly for this reason, I insist on the right to criticize her perpetually.”

Perhaps no historical moment captures critical patriotism more succinctly than does Francis Bellamy’s September 1892 creation of the Pledge of Allegiance. Bellamy, a Christian Socialist minister whose activism against inequality and racism led him to leave churches in both Boston and Florida, intended the Pledge to be aspirational, a pledge to work toward the nation’s ideals. As he later reflected on the Pledge’s concluding lines, “Just here arose the temptation of the historic slogan of the French Revolution which meant so much to Jefferson and his friends, ‘Liberty, equality, fraternity.’ No, that would be too fanciful, too many thousands of years off in realization. But we as a nation do stand square on the doctrine of liberty and justice for all.” 

Our current moment features striking moments and examples of critical patriotism as well. There’s Colin Kaepernick, who said of his initial 2016 anthem protest, “when there’s significant change and I feel like that flag represents what it’s supposed to represent in this country, I’ll stand.” And there’s the 1619 Project, which highlights not just the horrors of slavery and racism, but also and especially the resistance and activism, the exemplary critical patriotism, of African Americans. As Nikole Hannah-Jones puts it in her introductory essay, “It is we who have been the perfecters of this democracy.”

The celebratory and mythic patriotism of Trump’s “patriotic education” proposal might seek to exclude the 1619 Project from our national and historical narratives. But in truth, this debate marks the latest stage in the contested history of American patriotism—a history that needs a far more prominent place in our collective memories and current conversations alike.  

The Second Amendment has Never Covered Kenosha Shooter Kyle Rittenhouse

Kenosha shooter Kyle Rittenhouse’s lawyer made headlines recently when he claimed that his client’s gun possession fell under the “well regulated militia” clause of the Second Amendment. The claim was novel, at best; even the Supreme Court rulings striking down some gun control laws have made a point of not striking down the very idea of gun control. Even for those eager to see both some level of sanity in US gun laws and some measure of justice for the men killed in Kenosha, the proposed defense was enough to raise questions about the lawyer who made it – John Pierce, whose past clients include Rudy Giuliani – and about the competence of Rittenhouse’s representation. Experts in modern case law have already made clear that Mr. Pierce’s claims about a member of the militia having the right to carry any weapon anywhere lack any legal validity.

Those claims also lack historical merit, and that they could get as far as they have is one more sign of the general ignorance in today’s America about what the militia was when the Bill of Rights was written and, therefore, what the Second Amendment was supposed to accomplish. So first things first: Rittenhouse was not part of the militia. Not in the way that the Second Amendment intended, in any case, nor by any eighteenth-century definition of “militia.” If anyone in Kenosha could claim to be part of the militia, it was the National Guardsmen there (more on that below). 

The eighteenth-century militia was an official institution which answered to the state governments and, before that, to the colonial governments. To the extent that the militia ever was “well regulated,” one could not simply declare oneself a member; nor, for that matter, could the men whom Rittenhouse shot declare that they were not members. The militia as it existed in the eighteenth century was the backbone of a system of citizen military institutions that died a long time ago. There were scores of old militia laws, from both the colonial era and from the early state constitutions. Those regulations made a few things clear – things which, though common knowledge at the time, seem unknown to most Americans today, including those who insist on their status as “Second Amendment supporters.”

In the eighteenth century, militia membership and participation was not a choice. The men in it were legally required to participate. To put this in more concrete terms, the colonial regulations would state that all men “16 to 60,” or “above seventeen years and under sixty,” or “21 to 60,” etc., would be members of the militia. There were often qualifiers to this, such as “all able-bodied men,” or “all free male persons.” Most adult men who fell outside those definitions were prohibited from participating, or even from owning weapons. As one might expect, this division was racial – white men were required to be part of the militia, and non-whites were either excluded or limited to specific positions like drummer or scout. There was some variation along these lines; poor white men were in most but not all of the militias, and Catholics were occasionally excluded, but as a general rule of thumb white men served in the militia while women and men of color did not.

This was no accident, nor was it just symbolic – with little to no professional police or military forces, the militia was how whites maintained their domination. The militia was also how whites in slave states prevented enslaved people from rising up, and it was how communities along the frontier were able to take land away from Native American tribes. Rather than relying on full-time professionals, the citizens – a status limited at the time to white men – participated part-time in their local units, in which anyone eligible was required to register, and the officers of each unit kept registers of those men. The militia also trained together several times a year. Men who were required to participate but who chose not to were subject to fines and other penalties.

It is fair to note, though, that while the militia was a key institution in colonial society and in the first decades of the republic, it was never as well regulated as its advocates hoped. The regulations themselves often mentioned the sorry state into which the militia had fallen. Virginia’s 1755 militia act, for instance, noted that the previous acts “hath proved very ineffectual,” and as a result “the colony is deprived of its proper defence in time of danger.” So in Pierce’s defense – although this is hardly sufficient – the militia rarely lived up to its goals, or acted as its regulators expected it to. It was also not unknown for the men who made up the militia to reject their government and rise up in arms against it, and even to claim to be the militia while doing so. Shays’s Rebellion and the Whiskey Rebellion are the two most famous examples. When that happened, there was relatively little that a governor could do. And while those rebellions were always put down eventually, the punishments imposed on the men who participated were fairly mild (especially compared to the punishments meted out to enslaved people who took part in any sort of rebellion). There were also occasions like the 1782 Gnadenhutten Massacre, when local militiamen took it upon themselves to execute unarmed Native Americans who were not at war with the US, leading to widespread condemnation but not to any criminal charges against the militiamen. Still, the Constitution’s militia clause made it clear that the role of the militia was to suppress insurrections, not to participate in them.

The militia, then – that “palladium of liberty” – was a messy and unstable institution whose members often resented having to participate, and which at times rose up and left the government in desperate straits. It was only really effective at maintaining racial divisions and inflicting violence on non-whites. So it is not surprising that as the United States grew, the militia became less popular. During the first years of the Republic, there were several plans to revitalize the militia, all of which died on the vine – both under Washington, who was skeptical of the militia’s potential on the battlefield, and under Jefferson, who was more enthusiastic about its possibilities. Over the course of the nineteenth century, professional forces began to take over many of the militia’s duties, both as an internal police force and as an external army. The Militia Act of 1903 provided a long overdue acknowledgment that the militia of the founders’ generation was dead; in its place came the “organized militia,” which consisted of “the regularly enlisted, organized, and uniformed active militia” – henceforth to be known as the National Guard (as it was already known in much of the US). The “unorganized militia,” meanwhile, consists of all remaining “able-bodied males” aged seventeen to forty-five – a fact unknown to most of the members of that unorganized militia, as it brings with it no obligation to train, register, or do anything else whatsoever.

Replacing the citizens’ militia with paid professionals has hardly been a perfect solution, as the Black Lives Matter movement has shown (following in the footsteps of a long tradition of civil rights groups’ criticism of police violence and racism). The uneasy overlap between professional police and racist vigilantism has also been a recurring problem, as shown by the presence of someone like Rittenhouse – an admirer of the police – taking part in vigilantism.

So Rittenhouse does have his historical precedents. If his lawyer wants to argue that, by attempting to repress the Black Lives Matter movement, Rittenhouse was acting in the spirit of those eighteenth-century militias which went outside the law and defied their state governments – especially those which did so in the interest of promoting white supremacy – his case would be fairly solid. It would not, however, be an exoneration. Far from it. What Rittenhouse and his lawyers cannot argue is that he was part of the well-regulated militia. Unlike the National Guardsmen in Kenosha, who operated under government authority, Rittenhouse was not acting as that militia was legally required to act. Beyond that, though, the militia which the Second Amendment declared “necessary to the security of a free State” died a long time ago.

Breaking Lincoln's Promise

“Why should I go to that cemetery? It’s filled with losers,” President Donald Trump reportedly stated when justifying his unwillingness to visit the American war dead at the Aisne-Marne American Cemetery in France. The cemetery commemorates soldiers from the First World War, many of whom died at the nearby battlefield of Belleau Wood. Mr. Trump’s supporters who accompanied the President on his trip to France in 2018, including current White House staffers, denied that the Commander-in-Chief of the U.S. military made such utterances. Jeffrey Goldberg, Editor-in-Chief of The Atlantic and author of the article, quotes four anonymous sources “with firsthand knowledge of the discussion that day.” President Trump had denigrated American soldiers, veterans, and their families before and during his presidency, including Senator John McCain, who died of cancer in 2018 and had been a prisoner of war for more than five years after his Navy plane was shot down over North Vietnam. Mr. Trump was also accused of being disrespectful to Khizr Khan, a Gold Star father whose Muslim son, Captain Humayun Khan, died heroically protecting his fellow soldiers from a suicide attack, for which he was posthumously awarded the Bronze Star and Purple Heart.

Critics of the President refer to these moments to suggest that Donald Trump is unfit to be Commander-in-Chief, while the President’s supporters use them to suggest that the media and liberals maliciously distort and lie about Mr. Trump to undermine his reelection bid. The controversy persists because the 2020 presidential election between Donald Trump and Joe Biden is taking place within the context of a war over American memory. President Abraham Lincoln crystallized American cultural memory when he dedicated the cemetery at Gettysburg in November 1863. Lincoln summoned the loss and grief of hundreds of thousands of Americans and harnessed their mourning to the bodies of Union soldiers, declaring “that these dead shall not have died in vain” but rather that their sacrifices guaranteed “a new birth of freedom and that government of the people, by the people, for the people shall not perish from the earth.” He articulated what can be described as a promise to the dead that the nation would remember them and their collective sacrifices, and he obligated the living to guarantee this promise. More than a justification to continue the war, Lincoln activated a hot memory, one that yoked America’s slave past to the democratic purposes of the Civil War in the present.

The hot memory that Lincoln initiated was short-lived. The intensification of the postwar Confederate lost cause movement, coupled with the eventual de facto and then legal segregation codified in Jim Crow laws, helped cool American cultural memory by denying the role that slavery played in American history, thus effectively separating the past from the present. American cultural memory of the Civil War continued to cool in the first half of the twentieth century, allowing Americans to ignore their domestic racist past, which enabled their imperialist ambitions and interests across the Atlantic during the First World War. American soldiers buried in the Aisne-Marne cemetery in France died in segregated combat units in the cause of expanding American influence in Europe. In his Memorial Day speech in 1919, President Woodrow Wilson stood in the Suresnes military cemetery and explicitly (without noting the irony) tied the war dead of the First World War to the war dead of the Civil War. He claimed that the Civil War dead died to create a new nation and that the First World War dead sacrificed their lives to create a new Europe. Wilson succeeded in incorporating the First World War dead into Lincoln’s promise and likewise obligated the living to remember them collectively as a noble brotherhood of American sacrifice.

As the United States expanded its interests across the Atlantic, Americans could not sustain the chilling effect on cultural memory. The hypocrisy of segregation, especially in the eyes of the nations Americans were trying to influence, manifested itself through the redlining of residential neighborhoods, the death of Emmett Till, and the marches in Birmingham and Selma. This duplicity became acute during the Cold War, when Communist nations used American segregation to undermine the United States’ spread of democracy. Inside the U.S., the Civil Rights Movement of the 1950s and 60s attempted to heat American cultural memory again by calling for democratic reforms that would incorporate African Americans and others into the American dream. Civil rights leaders marched and protested, in part, to remind Americans of how much their collective past, especially when it came to slavery, was intertwined with the segregationist policies of the present. While this movement succeeded in thawing and even reheating cultural memory, its efficacy waned as the twentieth century closed. The logic of the Cold War, the imperialistic nature of the Vietnam conflict, and the economic stagnation of the 1970s helped cool American memory again. The end of the Cold War allowed Americans to embrace the present and celebrate their defeat of Communism, which helped distract them from completing the program of racial equality. This cooling trend continued into the twenty-first century, with the invasions of Iraq and Afghanistan in the war on terror pushing the American present further and further away from dealing with the traumatic and difficult aspects of the American past.

Donald Trump emerged as a presidential candidate by taking advantage of this already frozen memory. President Trump’s denigration of dead soldiers is part of this movement to “drain the swamp” and a disruption of Lincoln’s promise and of the obligation on which American commemorative traditions have been built. This time he is not just attacking an individual politician who had been a Navy pilot or a Gold Star father who spoke out against his presidential candidacy; he is calling the war dead collectively “losers” and “suckers,” including the war dead that Lincoln claimed “gave their lives that that nation might live.”

The first Republican President reminded his audience in Pennsylvania that “the world will little note, nor long remember what we say here, but it can never forget what they did here.” The current Republican President’s personal memory of the past is not unique; rather, it reflects the cultural memory of America: cold, presentist, and amnesiac. His unwillingness to remember the war dead at Aisne-Marne, or apparently even to know “who were the good guys in this war,” is symptomatic of a larger cultural amnesia that has persisted since the end of the Cold War. American cultural memory, in its frozen state, cannot accommodate the past. Americans must warm their collective memories if they want to navigate the present. They must bring the past into the present in a way that allows for an honest discussion of it, so that they can make informed decisions about the present. Abraham Lincoln understood this when he spoke at Gettysburg. Donald Trump’s refusal to commemorate the war dead at Aisne-Marne illustrates not only how far the Republican party has shifted away from Lincoln but also how far American society has moved away from its obligation to remember the war dead. Lincoln’s promise to the dead seems unfulfilled, if not wholly broken.

Woody Guthrie's Communism and "This Land Is Your Land"

Just before her high wire performance at the 2016 Super Bowl, Lady Gaga sang the opening lines of Irving Berlin’s “God Bless America” before segueing into Woody Guthrie’s “This Land is Your Land.” That Guthrie’s song was written in angry response to Berlin, and that its incorporation into such a corporate spectacle likely would have caused Woody no small amount of distress, appears to have been lost on Gaga. Such confusion is hardly unique. The song has been mired in ambiguity for decades.

Woody Guthrie’s inspiration for the song came as he traveled from California to New York in the winter of 1940. At the time there was no getting away from Berlin’s song, which was everywhere on the radio, sung by Kate Smith — someone back in the news, not for that composition, but for singing songs with racist content. When Guthrie finally reached New York he sat down and wrote “God Blessed America for Me,” which would become “This Land is Your Land” — with its melody taken from the Carter Family’s “When the World’s on Fire.” Guthrie’s song, rather than extolling God’s special relationship with the United States, asked how it could be that He blessed a country where people were standing in the relief lines, hungry and out of work. It was, to say the least, not a barnstormer of unabashed patriotism.

Guthrie & the Communist Party

To the degree people know the politics of Woody Guthrie today, he is thought of as an advocate for social justice with some association with mid-century US communism, but mainly as a free spirit who travelled among and gave voice to the dispossessed in the United States. The matter of whether or not he was an actual member of the Communist Party USA (CPUSA) has long been debated, with the consensus being that he was simply not party material.

On a general level this is correct, but not wholly so. Not only was Guthrie a close supporter of the party, there is strong evidence he was a member of the group for a short time in the early 1940s, before being dropped over discipline issues. He even reapplied to the group during World War II but was rejected, according to his second wife Marjorie Guthrie, who revealed as much in Oregon’s Northwest Magazine in 1969.

This is different from the prevailing view, most forcefully argued in Ed Cray’s Ramblin’ Man: The Life and Times of Woody Guthrie, which cites numerous friends and former comrades who claim Guthrie was not, and could never have been, in the Party. Cray, however, contradicts himself in a footnote:

[The writer] Gordon Friesen, on the other hand, maintained that Guthrie was a member of the Communist Party briefly in 1942. It ended sometime in the summer months, Friesen wrote, after Guthrie was summoned by “his organizer” to “a branch meeting” in Greenwich Village. Guthrie was to answer charges of lack of discipline… He had pledged to appear at a certain Village street corner to sell Daily Workers and then had failed to show up.

Cray found this anecdote in a letter from Friesen to historian Richard Reuss, the author of the seminal work American Folk Music and Left-Wing Politics: 1927-1957.

Notably, this story had been told elsewhere as well, by music critic Robert Shelton in his biography of Bob Dylan. Shelton says Friesen wrote to him directly, remarking on how badly the CPUSA treated artists and citing Guthrie as an example: “I remembered Woody showing me a letter from his section organizer in the Village ordering him to appear to answer charges for ‘lack of discipline’ because he had failed to show up at a certain corner to sell the Daily Worker.”

Both these stories, in turn, align with Pete Seeger’s recollections. According to Seeger, who spoke with Robert Santelli for his book Hard Travelin’: The Life and Legacy of Woody Guthrie, “Woody considered himself a communist. Was he a member of the Party? Not really.” As Seeger put it, “The Party turned him down. He was always traveling here and there. He wasn’t the kind of guy you’d give a Party assignment to. He was a valued fellow traveler.” Seeger, however, continues:

On the other hand, Sis Cunningham, who was a much more disciplined person than either me or Woody, was in a Greenwich Village Branch of the Party. She got Woody in. She probably said, ‘I’ll see Woody acts responsibly.’ And so Woody was briefly in the Communist Party.

Sis Cunningham, it should be noted, was married to Gordon Friesen, and was one of the Almanac Singers in New York before World War II, along with Guthrie, Seeger, Bess Lomax, Millard Lampell and others. That aside, Seeger’s characterization matches the information above.

Dues-paying or not, Guthrie was a committed partisan of the CPUSA. As Los Angeles Party leader Dorothy Healey put it, “If he wasn’t a Party member, he was the closest thing to it.” This is important not only factually, but for what it says about the attention the FBI directed at him. The Bureau tracked Woody Guthrie for over three decades, compiling files adding up to 593 pages. If his association with the Party was as tenuous as some have claimed, then the FBI was seriously off track in the attention it gave him. From the Bureau’s perspective, pursuing people with serious ties to US communism made sense. Which is not to say it was just: Guthrie was not breaking any laws or otherwise engaged in activity meriting such attention. In fact it is abominable that the FBI continued to monitor him for years after his diagnosis with Huntington’s chorea, as he was losing his ability to walk and even speak.

A Song With Many Meanings

Woody Guthrie met the Communist Party in Los Angeles in 1939, while working at radio station KFVD. There he met Ed Robbin, a writer for People’s World, the party’s West Coast newspaper. One of the first things Guthrie did after meeting Robbin was to write a song about Tom Mooney, a labor organizer who had been imprisoned for allegedly bombing a “Preparedness Day” parade held in anticipation of the US entering World War I. Guthrie’s first partisan act in that respect was to write “Tom Mooney is Free” on the occasion of Mooney’s release. A greater example of his partisanship came in the lyric “Why Do You Stand There in the Rain?” which he wrote in response to Franklin Roosevelt’s scolding of a youth rally — one that included communists — held soon after the USSR went to war against Finland. Among other things, the lyrics take a strong anti-war stand consistent with the CP’s slogan at the time, “The Yanks Aren’t Coming” — a position the party held during the non-aggression pact between Germany and the USSR. Guthrie, in other words, was already incorporating the political line of the Communist Party into his lyrics when he sat down at actor Will Geer’s house in New York to write what would become “This Land is Your Land.”

In that respect, to better understand the song, one needs to understand the peculiarity of the CPUSA under the leadership of Earl Browder. A major slogan of the CP when Woody came on the scene was “Communism is 20th Century Americanism.” That slogan was in keeping with Browder’s attempt to create a big tent for communism in the United States, steeped in anti-fascism and social democracy. That the slogan was a mash-up of communism and US exceptionalism helps explain why Guthrie’s song stops in New York rather than going on to the wider world — communism, after all, was supposed to be internationalist. Browder and the CP’s approach to communism was far more U.S.-centric than internationalist — except of course when it came to supporting the geopolitical dictates of the Soviet Union. All of this explains the orientation of the song.

Depending on the listener, “This Land is Your Land” can be heard in different ways. Guthrie most likely intended it as a call to move beyond private property, toward a greater equality and common humanity. Notably, the lines usually excluded talk about encountering a sign reading “No Trespassing,” while the other side of the sign “didn’t say nothing” — that being the sign that was “made for you and me.”

More moderately, it can be heard as a liberal-secular hymn, in which all people ought to share in the country’s bounty. That is why Pete Seeger and Bruce Springsteen could safely perform it at Barack Obama’s first inaugural celebration.

Alternatively still, and not without basis, it can be heard as a proclamation of American chauvinism — in fact, the song has been criticized as a justification of manifest destiny, and even of the theft of native lands, because its lines extol a US landscape acquired through no small amount of blood and conquest.

Such debate will likely never be fully resolved. However, given the politics of its author, and in the interests of showing a little respect for the dead, it would seem a modest request — all due respect to Lady Gaga — to ask that the song not again be sung as part of a medley with “God Bless America.”

Ruth Bader Ginsburg Helped Shape the Modern Era of Women’s Rights – Even Before She Went on the Supreme Court  

Judge Ruth Bader Ginsburg paying a courtesy call on Sen. Daniel Patrick Moynihan, D-N.Y., left, and Sen. Joseph Biden, D-Del., in June 1993, before her confirmation hearing for the Supreme Court. AP/Marcy Nighswander

Jonathan Entin, Case Western Reserve University

Justice Ruth Bader Ginsburg died on Friday, the Supreme Court announced.

Chief Justice John Roberts said in a statement that “Our nation has lost a jurist of historic stature.”

Even before her appointment, she had reshaped American law. When he nominated Ginsburg to the Supreme Court, President Bill Clinton compared her legal work on behalf of women to the epochal work of Thurgood Marshall on behalf of African-Americans.

The comparison was entirely appropriate: As Marshall oversaw the legal strategy that culminated in Brown v. Board of Education, the 1954 case that outlawed segregated schools, Ginsburg coordinated a similar effort against sex discrimination.

Decades before she joined the court, Ginsburg’s work as an attorney in the 1970s fundamentally changed the Supreme Court’s approach to women’s rights, and the modern skepticism about sex-based policies stems in no small way from her lawyering. Ginsburg’s work helped to change the way we all think about women – and men, for that matter.

I’m a legal scholar who studies social reform movements and I served as a law clerk to Ginsburg when she was an appeals court judge. In my opinion – as remarkable as Marshall’s work on behalf of African-Americans was – in some ways Ginsburg faced more daunting prospects when she started.

Thurgood Marshall, in 1955, when he was the chief counsel for the NAACP. AP/Marty Lederhandler

Starting at zero

When Marshall began challenging segregation in the 1930s, the Supreme Court had rejected some forms of racial discrimination even though it had upheld segregation.

When Ginsburg started her work in the 1960s, the Supreme Court had never invalidated any type of sex-based rule. Worse, it had rejected every challenge to laws that treated women worse than men.

For instance, in 1873, the court allowed Illinois authorities to ban Myra Bradwell from becoming a lawyer because she was a woman. Justice Joseph P. Bradley, widely viewed as a progressive, wrote that women were too fragile to be lawyers: “The paramount destiny and mission of woman are to fulfil the noble and benign offices of wife and mother. This is the law of the Creator.”

And in 1908, the court upheld an Oregon law that limited the number of hours that women – but not men – could work. The opinion relied heavily on a famous brief submitted by Louis Brandeis to support the notion that women needed protection to avoid harming their reproductive function.

As late as 1961, the court upheld a Florida law that for all practical purposes kept women from serving on juries because they were “the center of the home and family life” and therefore need not incur the burden of jury service.

Challenging paternalistic notions

Ginsburg followed Marshall’s approach to promote women’s rights – despite some important differences between segregation and gender discrimination.

Segregation rested on the racist notion that Black people were less than fully human and deserved to be treated like animals. Gender discrimination reflected paternalistic notions of female frailty. Those notions placed women on a pedestal – but also denied them opportunities.

Either way, though, Black Americans and women got the short end of the stick.

Ginsburg started with a seemingly inconsequential case. Reed v. Reed challenged an Idaho law requiring probate courts to appoint men to administer estates, even if there were a qualified woman who could perform that task.

Sally and Cecil Reed, the long-divorced parents of a teenage son who committed suicide while in his father’s custody, both applied to administer the boy’s tiny estate.

The probate judge appointed the father as required by state law. Sally Reed appealed the case all the way to the Supreme Court.

Ginsburg did not argue the case, but wrote the brief that persuaded a unanimous court in 1971 to invalidate the state’s preference for males. As the court’s decision stated, that preference was “the very kind of arbitrary legislative choice forbidden by the Equal Protection Clause of the 14th Amendment.”

Two years later, Ginsburg won in her first appearance before the Supreme Court. She appeared on behalf of Air Force Lt. Sharron Frontiero. Frontiero was required by federal law to prove that her husband, Joseph, was dependent on her for at least half his economic support in order to qualify for housing, medical and dental benefits.

If Joseph Frontiero had been the soldier, the couple would have automatically qualified for those benefits. Ginsburg argued that sex-based classifications such as the one Sharron Frontiero challenged should be treated the same as the now-discredited race-based policies.

By an 8–1 vote, the court in Frontiero v. Richardson agreed that this sex-based rule was unconstitutional. But the justices could not agree on the legal test to use for evaluating the constitutionality of sex-based policies.

New York Times article about the Wiesenfeld case, which refers to Ginsburg as ‘a woman lawyer.’ New York Times

Strategy: Represent men

In 1974, Ginsburg suffered her only loss in the Supreme Court, in a case that she entered at the last minute.

Mel Kahn, a Florida widower, asked for the property tax exemption that state law allowed only to widows. The Florida courts ruled against him.

Ginsburg, working with the national ACLU, stepped in after the local affiliate brought the case to the Supreme Court. But a closely divided court upheld the exemption as compensation for women who had suffered economic discrimination over the years.

Despite the unfavorable result, the Kahn case showed an important aspect of Ginsburg’s approach: her willingness to work on behalf of men challenging gender discrimination. She reasoned that rigid attitudes about sex roles could harm everyone and that the all-male Supreme Court might more easily get the point in cases involving male plaintiffs.

She turned out to be correct, just not in the Kahn case.

Ginsburg represented widower Stephen Wiesenfeld in challenging a Social Security Act provision that provided parental benefits only to widows with minor children.

Wiesenfeld’s wife had died in childbirth, so he was denied benefits even though he faced all of the challenges of single parenthood that a mother would have faced. The Supreme Court gave Wiesenfeld and Ginsburg a win in 1975, unanimously ruling that sex-based distinction unconstitutional.

And two years later, Ginsburg successfully represented Leon Goldfarb in his challenge to another sex-based provision of the Social Security Act: Widows automatically received survivor’s benefits on the death of their husbands. But widowers could receive such benefits only if the men could prove that they were financially dependent on their wives’ earnings.

Ginsburg also wrote an influential brief in Craig v. Boren, the 1976 case that established the current standard for evaluating the constitutionality of sex-based laws.

Like Wiesenfeld and Goldfarb, the challengers in the Craig case were men. Their claim seemed trivial: They objected to an Oklahoma law that allowed women to buy low-alcohol beer at age 18 but required men to be 21 to buy the same product.

But this deceptively simple case illustrated the vices of sex stereotypes: Aggressive men (and boys) drink and drive, women (and girls) are demure passengers. And those stereotypes affected everyone’s behavior, including the enforcement decisions of police officers.

Under the standard delineated by the justices in the Boren case, such a law can be justified only if it is substantially related to an important governmental interest.

Among the few laws that satisfied this test was a California law that punished sex with an underage female but not with an underage male as a way to reduce the risk of teen pregnancy.

These are only some of the Supreme Court cases in which Ginsburg played a prominent part as a lawyer. She handled many lower-court cases as well. She had plenty of help along the way, but everyone recognized her as the key strategist.

In the century before Ginsburg won the Reed case, the Supreme Court never met a gender classification that it didn’t like. Since then, sex-based policies usually have been struck down.

I believe President Clinton was absolutely right in comparing Ruth Bader Ginsburg’s efforts to those of Thurgood Marshall, and in appointing her to the Supreme Court.

Jonathan Entin, Professor Emeritus of Law and Adjunct Professor of Political Science, Case Western Reserve University

This article is republished from The Conversation under a Creative Commons license. Read the original article.

Historians Respond to the Death of Justice Ruth Bader Ginsburg

Nostalgia and the Tragedy of Trump's Speech at Mount Rushmore

Standing in front of Mount Rushmore on July 3, Donald Trump offered the American people a recitation of their history that was not only stirring but saturated with nostalgia.  With visages of notable presidents looming behind him, he presented a magical tale of a wonderful nation pursuing its manifest destiny and the heroes who helped it along.  Magnificent achievements abounded; victims and aggression were nowhere to be found.  His oration perfectly underscored David Lowenthal's point that nostalgia is memory with the pain removed. The pain is in the present. 

In Trump's sentimental and self-satisfying rendering of our past, Americans emerge as a people beyond reproach--free of any hint of original sin or culpability for sordid deeds. He legitimately praised the laudable actions of America's doctors and nurses who fought the coronavirus, the "wonderful veterans" who fought our wars and the law enforcers who patrol our streets. He stood in awe of George Washington for persevering through the bitter winter at Valley Forge, of Thomas Jefferson for authoring the Declaration of Independence, calling it "one of the greatest treasures of human history," and of Abraham Lincoln for saving the union and ending slavery. He paid tribute to "our founders" for launching not only a revolution in government, but a revolution in the pursuit of "justice, liberty, and prosperity." He even found space in his pantheon of heroes for Muhammad Ali, Buffalo Bill Cody, Frederick Douglass, Andrew Jackson, General George Patton, and Elvis Presley. Certainly there was much that Americans could be proud of in this particular history lesson, but Trump's omissions were glaring. The ache of Indian removal, Jim Crow, lynchings, race riots, labor exploitation, Vietnam, Iraq, sexual abuse, and environmental degradation was left out of this version of who we were.

Soon, however, Trump's wistful saga turned ominous. The president declared that the future of the "most just and exceptional nation ever to exist on earth" was in peril, threatened by forces that emanated not from distant shores but from within. He warned his listeners of dangers that jeopardized "every blessing our ancestors fought so hard for." America was now threatened by "angry mobs" who were attempting to deface "our most sacred memorials and unleash a wave of violent crime in our cities." Entrenched in urban enclaves run by "liberal democrats," these hordes represented the vanguard of a "new far-left fascism," intolerant of contradictory views, that was now spreading a form of totalitarianism in our schools and newsrooms and threatening to unravel the very revolution that gave birth to the nation in the first place. In an age when Americans had come to fear Muslims from abroad, immigrants with criminal intent, and a deadly virus, Trump now insisted that the real existential threat to America and its glorious history came from domestic terrorists who opposed him politically and plotted to "defame our heroes, erase our values, and indoctrinate our children."

Trump's speech offered a classic case of what scholars such as Svetlana Boym have called "restorative nostalgia," a highly emotional impulse that longs to flee from the incessant swirl of change and tension in the present and find truth and comfort in a romanticized past free of turmoil and trauma. Such wistfulness is popular in our time because it promises to serve as a source of enduring truths--suspending the need for critical thinking--and to free the virtuous nation from any responsibility to address wrongs it may have committed. In Trump's retelling of American history, in fact, there were no misdeeds. He offered a memory and a history free of guilt, and certainly without any evidence that might support claims for justice in the present.

Boym also identified a contrasting form of nostalgia that she claimed was more "reflective." In this particular turn to the past, facts are prized more than myths, and a careful assessment is made of what has worked and what has failed so that improvements or reforms can be made going forward. This more thoughtful nostalgia longs not for the return of paradise but for the implementation of lessons learned from a history filled with forward and backward thrusts. It serves as an antidote to the lure of utopia, or to an "unreflected nostalgia," which can breed "monsters" or evil forces like "angry mobs." Senator Joseph McCarthy's warnings in 1950 that there were communists or "men in high places" lurking in the government who threatened American values were a prime example of this turn to "monsters." When "mobs" appear in the streets, defenders of paradise can never see their grievances as legitimate, because there can be no reason to critique what is already exceptional and faultless. The only recourse is to quash the mob, drain the swamp, demand a restoration of law and order and reaffirm traditional values. Negotiations would be a waste of time.

Ironically, Abraham Lincoln, whose monument glared down at Trump as he spoke in South Dakota, turned out to be a nostalgic as well, but one who relied more on a sober analysis of the past he longed for than on a wistful one. Drawing upon his experience leading the nation through the Civil War, the sixteenth president yearned for earlier times when America was peaceful and united. As most historians will argue, Lincoln fought the war to save the union. But Lincoln, drawing upon his view of history, saw the fight to save the United States as much more than a preservation project. He felt Americans had a debt to their forefathers to sustain not only the nation they created but what he called America's "first principles." He shared the sense that the nation was exceptional, not because it produced heroic figures but because it was born with a moral obligation to promote and protect the ideal of universal liberty and equal rights for all.

Lincoln did not always believe as fully in these ideals as he might have. Before the Civil War he often felt the best solution to the race problem in America was for African-Americans to return to their home continent. But over time--as he reflected upon the bitter disputes over slavery that had divided America before the 1860s and upon the human sacrifices of the war--he came to see clearly that the task before him--and for Americans who followed him--was to save the union so that it could continue to serve as a moral agent pressing for tolerance and equal rights for men and women everywhere. To Lincoln, America's destiny--shaped by its history--was less about producing heroes or quashing imagined conspiracies than about ensuring that all Americans had access to their birthright of equality. That is why his famous speech at Gettysburg mentioned "our fathers," the pain of soldier deaths, and the continuing need to see that a government of the people should not perish from the face of the earth.

Trump's story of a flawless people and nation would not have resonated with Lincoln because it failed to emphasize the central tenet of the American experience: that all human beings needed to be seen in the light of the dignity they possessed and the rights they deserved. Trump did praise Jefferson and the Declaration of Independence in his talk, but then he buried them in a convoluted tale of heroes, destiny, statues, mobs, and violent cities that left the impression that his need to destroy political opponents took precedence over everything else. He showed no interest in endorsing Lincoln's point that the government and the nation needed to be saved so that it could continue the progressive dream of expanding human rights into the foreseeable future. In Trump's future he planned to build a monument to all of America's heroes and ventured that centuries from now our legacy would be seen in the cities we built and the "champions we forged." The tragedy of his nostalgia was not only that it failed to contemplate the calamities of the history he told but that his vision for the nation had drifted so far from the razor-sharp devotion Lincoln had to human rights.

Dwight Eisenhower Built up American Intelligence at a Crucial Moment

Soviet officer inspects CIA tunnel under East Berlin, 1956. Photo Bundesarchiv, Bild. CC BY-SA 3.0 de

More than any other president--with the possible exception of George Washington--Dwight D. Eisenhower did not need on-the-job training to understand the value of good intelligence. As Supreme Allied Commander in Europe during World War II, Eisenhower relied heavily on Ultra, the British code-breaking operation that allowed the Allies to read encrypted German communications. At the war’s conclusion, Eisenhower said the intelligence had been “of priceless value to me.”

So it was with no little chagrin that upon taking office in Washington in January 1953, Eisenhower learned just how far western intelligence had declined since the war.

This week, the Dwight D. Eisenhower Memorial is being dedicated in Washington, not far from the U.S. Capitol. Eisenhower’s presidency is sometimes overshadowed by his wartime command. One aspect in particular that is often overlooked is how vastly US intelligence capabilities increased during his administration.

Eisenhower took office at a uniquely vulnerable period in American history. The Soviet Union had already shocked the West in 1949 by successfully testing a nuclear bomb after stealing Manhattan Project secrets. In August 1953, the Soviets detonated their first hydrogen bomb. It was an unpleasant surprise for Eisenhower—Western intelligence had no inkling the Soviets would achieve such destructive capability so quickly. On top of the nuclear threat, an enormous Red Army force--never withdrawn from Eastern Europe at the end of World War II--remained poised along the borders with Western Europe.

But as Eisenhower soon learned, the U.S. had virtually no good intelligence on the Soviet Union. For some years during and after World War II, the U.S. intercepted and decrypted secret Soviet radio communications as part of a secret program codenamed VENONA. But in 1948, after the secret was betrayed by KGB spies Kim Philby and William Weisband, the Soviets changed their cryptographic systems and shifted much of their communications from radio to landlines, leaving the West almost entirely in the dark about Moscow’s military capabilities and intentions. CIA efforts to place agents inside the Soviet Union had failed miserably. Other than rare overflights along the periphery of Soviet territory by U.S. and British military aircraft, there was none of the overhead imagery that the U-2 and satellites would later provide. “We were simply blind,” said David Murphy, a CIA officer who would serve in Berlin.

Not long after the Soviet hydrogen bomb test, CIA Director Allen Dulles told Eisenhower “the Russians could launch an atomic attack on the United States tomorrow.” It left the president wondering whether he should consider launching a first strike to preempt the Soviets. “As of now, the world is racing toward catastrophe,” he wrote gloomily in his diary.

So Eisenhower was receptive when Dulles brought him a proposal soon afterwards for what would become one of the most audacious espionage operations of the Cold War, involving the divided city of Berlin. The idea was to team with British intelligence to dig a quarter-mile tunnel from West Berlin into East Berlin to tap into underground cables used by the Red Army to communicate with Moscow. The Berlin Tunnel, as recounted in Betrayal in Berlin, would be simultaneously the largest signals intelligence and covert operation the CIA had conducted to that point--not to mention effectively an incursion into Soviet-held territory.

Eisenhower gave the tunnel his ready approval, as did his former partner from World War II, Winston Churchill—who had returned to power in 1952 as Britain’s prime minister and was likewise dismayed at the lack of intelligence about the Soviet military. 

As president, Eisenhower pushed for aggressive intelligence gathering--within limits. “In general we should be as unprovocative as possible but he was willing to take some risks,” Andrew Goodpaster, then an Army colonel serving as Eisenhower’s staff secretary, later recalled. Eisenhower refrained from asking too many questions about exactly what the CIA was up to in Berlin. “He insisted that he have access to everything, and I think we did,” said Goodpaster. “But there were things that he deliberately did not inform himself about.” Eisenhower liked having plausible deniability, to guard against having to lie to the press or Congress about what he had known.

“President Eisenhower did not feel that he wanted to know the specifics of all these activities,” recalled Dillon Anderson, who served as Eisenhower’s national security advisor. “I don’t think he particularly wanted to know” the elaborate details of how the CIA intended to tunnel into East Berlin, Anderson said. But the president was keenly interested in the end product.

Construction of the tunnel began in great secrecy in September 1954, dug by a small U.S. Army Corps of Engineers team. They used the cover of an Army warehouse in Rudow, a remote corner of the American sector, to disguise the project from curious Soviet and East German guards across the nearby border.

As work continued, Dulles came to the president seeking authorization for another secret program, this one to develop a special high-altitude reconnaissance aircraft that would become known as the U-2. Once again, Eisenhower approved without hesitation. “Our relative position in intelligence, compared to the Soviets, could scarcely have been worse,” he later wrote. Bigger and better fleets of bombers and improved guided missile capability had given the Soviets an “ever-growing capacity for launching surprise attacks against the United States,” Eisenhower believed. He admitted to being “haunted” by the threat of a nuclear Pearl Harbor and created two commissions in 1954 to examine the ability of U.S. intelligence to protect the nation against such an attack.

The first report, a review of CIA covert operations led by Lieutenant General James Doolittle, hero of the wartime raid on Tokyo, described the U.S. as losing an intelligence battle that could have apocalyptic consequences: “If the United States is to survive, long-standing American concepts of ‘fair play’ must be reconsidered,” Doolittle wrote. The second commission, headed by MIT President James Killian, was more sober-minded but equally chilling in its conclusions. “The advantage of surprise attack has never been so great as now,” the Killian report said.

Good intelligence could not come too soon, as far as Eisenhower was concerned. “Our old conceptions of the time that would be available to governments for making of decisions in the event of attack are no longer tenable,” Eisenhower wrote to Churchill in January 1955. “I think it possible that the very life of a nation, perhaps even of Western civilization, could . . . hang upon minutes and seconds used decisively at top speed or tragically wasted in indecision.” 

In May 1955, after eight months of delicate work, the American and British tunnel team succeeded in tapping the first of the targeted trunk cables, located 27 inches below the surface of a heavily traveled East Berlin road. Back in the warehouse, a team of linguists and analysts was quickly inundated by a gusher of captured telephone calls and teletype communications, involving everyone from senior Soviet commanders to low-ranking logistician clerks across East Germany. Cartons filled with tape recordings were soon being flown almost every day to processing centers in London and Washington staffed by hundreds of translators, transcribers and analysts.

Bit by bit, a mosaic of the Soviet military was painstakingly assembled—its organization, deployment of forces, strengths and weaknesses, training, tactics, weaponry, radio and telephone networks and system of passwords. The captured conversations also revealed details about the Soviet nuclear program, Kremlin machinations, and Soviet intelligence operations, along with much other critical information. For Eisenhower, though, the greatest value of the tunnel lay in what it did not show: any indication that the Soviets were planning an attack. The preemptive strike Eisenhower feared he would need to launch never happened.

Ironically, the KGB had been tipped to plans for the tunnel by George Blake, a British intelligence officer involved in the project. But the KGB found itself in a dilemma. Blake was proving himself invaluable as a Soviet spy, and if the KGB did anything to stop the tunnel, he would immediately fall under suspicion, as one of only a handful who knew of the operation. Planting disinformation was likewise too risky, because it would stick out like a sore thumb amidst the torrent of real information captured by the tunnel. So the KGB left Red Army commanders in the dark. By the time the Soviets finally staged a discovery in April 1956, the tunnel had intercepted some 90,000 communications. In a sense, it was a precursor to the mass surveillance that would be employed by the National Security Agency.

Less than three months after the tunnel’s demise, a U-2 took off from West Germany on July 4, 1956, for the first overflight of Soviet territory. It was almost like handing off a baton. For nearly a year, the tunnel had provided the early warning the U.S. and its allies needed. Now, the U-2, with high resolution cameras able to cover vast amounts of territory from high altitudes, would be able to track the movement of military equipment, weaponry, troops and other logistical signs that might signal plans for an attack. The downing of a spy plane over the USSR in 1960 would prove deeply embarrassing to Eisenhower, when the Soviets exposed White House denials of espionage as lies by producing captured CIA pilot Francis Gary Powers. But despite Eisenhower’s regret over a cancelled summit meeting with Soviet leader Nikita Khrushchev, the value of the U-2 would be indisputably proven two years later when an overflight spotted Soviet nuclear missiles in Cuba.

Before Eisenhower left office in January 1961, the world’s first photo reconnaissance satellites had been launched as part of CORONA, another secret program the president had authorized. CORONA revolutionized intelligence collection, providing the CIA with the capability of scanning the globe. While many American intelligence disasters lay ahead, from the Bay of Pigs to Iraq and beyond, never again would the U.S. be as utterly blind as it had been when Eisenhower took office.

Rick Perlstein’s Reaganland: America’s Right Turn, 1976-1980

Cover Detail, Rick Perlstein's Reaganland, Simon & Schuster

Here are two key things to know about Rick Perlstein’s new book, Reaganland: 

First, despite the title, the book is much more than a political biography of Reagan. Covering the years 1976-1980, this is the latest in Perlstein’s four-volume series on the rise of the American conservative movement.  In fact, much of Reaganland details the rise and fall of the Carter administration, which, of course, made Reagan’s election in 1980 possible.

Second, while Donald Trump is mentioned only once (as a young real estate developer buying a Manhattan hotel with government subsidies), this book goes a long way to explaining the rise of the conservative constituencies (e.g. blue-collar workers, evangelicals) that made his 2016 election possible.

As Reaganland sets forth, many of the tactics used by Trump’s campaign were invented or perfected by Reagan’s skilled team of advertising and PR professionals. These include thinly veiled racist language to appeal to “aggrieved” white voters, careful cultivation of evangelical Christians and foot-stomping, flag-waving rallies in blue-collar cities.   

Although the phrase “Make America Great Again” was used in Reagan’s campaign, it was just one of a dozen different messages, all implying the nation had declined under Jimmy Carter. The all-white crowds that attended Reagan’s rallies often chanted “Reagan’s right! Reagan’s right!”

Some of the issues in the 1980 campaign would be familiar to today’s voters, such as the loss of manufacturing jobs, increased crime in big cities and a perceived weakness in the military. Other issues, such as Carter’s decision to cede control of the Panama Canal, the SALT II nuclear arms talks and the campaign for the Equal Rights Amendment, are topics that have faded out of sight.

A Pre-Internet Age

To read Reaganland is to be immersed in the world of 1979, where the pace of life, at least in terms of media consumption, was far more leisurely. Mass communication was limited to the three TV networks and your local daily newspaper. Cable TV was generally limited to distant rural areas, and the Cable News Network did not produce its first newscast until June 1980. The worldwide web, smart phones and streaming media were the stuff of science fiction. Most phones used rotary dials and music lovers purchased vinyl albums.

For political campaigns, this meant a relentless focus on staging a series of colorful campaign events to gain coverage in local newspapers and TV stations. Print was the dominant medium, with some 1,800 daily newspapers across America, all of them influencers on local voters. 

In this pre-Internet age, the only way to reach directly into American homes with an unfiltered message was through the mail — and a skilled right-wing marketer gave conservative causes a major advantage. Richard Viguerie perfected the art of direct mail over twenty-five years; he was dubbed “the six-million-dollar man” for his ability to rake in huge sums with a single, carefully targeted mailing.

This fundraising tool was just one of the factors that gave the conservative movement new power, enabling it to fracture the traditional liberal coalition of labor unions, urban dwellers and white-collar workers. Perlstein describes the rise of a right-wing “counterintelligentsia,” including the American Enterprise Institute, the Heritage Foundation and the American Conservative Union. These groups churned out fact sheets, position papers and editorials supporting “free enterprise” and “reduced government intervention.”

While candidate Reagan usually projected an image of sunny optimism, his campaign staff was busy appealing to the darker recesses of the American psyche. Perlstein notes “Reagan’s managers were targeting voters who felt victimized by government actions that cost them the privileges their whiteness once afforded them.”

This climate of white racial grievance was fostered by conservative-leaning newspapers, like The Chicago Tribune, which ran dozens of stories in 1979 about “welfare queens,” usually Black mothers, who “grew comfortable living off the public purse.”

A National Election Studies poll in 1980 found that the share of Americans who supported increased spending to improve the conditions of minorities fell to 24%, a record low. Instead, 46% of those surveyed said minorities should “help themselves” out of poverty.

Other campaign messages appealed to the fears of suburban women and evangelical Christians. Campaign surrogates spoke of “threats” to the American family in the form of homosexual teachers, “women’s libbers,” abortion mills and rampaging criminals from the inner city.

Carter’s Mistakes

Reagan benefited enormously from President Carter’s many stumbles, some self-inflicted and others due to outside forces. In the summer of 1979, a gasoline shortage swept across America. Kicked off by increasing worldwide consumption and then exacerbated by an American trucking strike, the shortage produced long lines at service stations. Frustrated drivers demanded Carter end the shortage.

Then in November 1979, Iranian militants seized 52 American diplomats at the embassy in Teheran. Carter’s favorability ratings plummeted. At the July 1980 Democratic Convention, Carter barely survived a challenge from Senator Ted Kennedy. Moderate Republican Congressman John Anderson ran as an independent candidate and attracted independent voters by advocating a 50-cent-per-gallon gasoline tax. On election night, Reagan swept to victory, winning 51% of the popular vote to Carter’s 41% and gaining a landslide in the electoral college: 489 to 39.

Perlstein’s great strength is weaving together a compelling narrative from a range of disparate sources. He is particularly skillful at showing how popular culture can influence voter concerns. Thus, Reaganland includes references to Eric Clapton, Tom Wolfe, Jane Fonda and movies such as The China Syndrome, Star Wars and The Godfather. 

One of the book’s failings, however, is a superficial treatment of Reagan the man. His two younger children, Patti Davis and Ronald Reagan, Jr., are never mentioned. Nancy Reagan is treated as a lightweight who merely organized parties. Several other biographers have shown that she had a major influence on her husband’s policies and personnel choices.

In this campaign year, a shelf full of books has been written about Trump. Administration insiders (John Bolton) and family members (Mary Trump) have shined a light on the many flaws of Donald J. Trump.

But as Reaganland reveals, Trump did not rise in a vacuum. He tapped into racial divisions and cultural anxieties that had been carefully cultivated years ago by Reagan and his far-right supporters.  

Unlike the Germans, We Have Failed to Recognize and Atone for Our Holocausts

Photo 1940.

As a nation we have not owned up to the grievous, centuries-long harm we have done to Native and African Americans. To what extent are we historians responsible?

 

In late August 2020 a policeman in Kenosha, Wis., fired seven shots into the back of a Black man named Jacob Blake. For three months--from late May, when another Black man, George Floyd, was killed by a policeman who knelt on his neck--we have seen almost continuous protests over how police have treated Black Americans. Shortly after the Blake shooting, various professional athletes, following the lead of the NBA’s Milwaukee Bucks, canceled some scheduled games or practices.

 

Former NBA hoopster and now Los Angeles Clippers coach Doc Rivers succinctly expressed the grief felt by many Black people: 

 

“We're the ones getting killed. We're the ones getting shot. We're the ones that were denied to live in certain communities. We have been hung, we have been shot. And all you do is keep hearing about fear. It's — it's amazing to me why we keep loving this country, and this country does not love us back. And it's just — it's really so sad.”

 

“This country does not love us back.” Why? Why does systemic racism continue to trouble our country, a century and a half after the end of slavery? One answer is that we have not sufficiently acknowledged, not sufficiently atoned for, the sin of slavery.

In a previous article I referred to two of the USA’s “most heinous crimes—genocide against Native Americans and slavery.” Now (in my title) I refer to them as “our holocausts.”

 

I do not use the word “holocaust” carelessly. Historians should guard against throwing around such words loosely. But holocaust should not be limited to “the killing of millions of Jews by the German Nazi government in the period 1941–5.” This is only an Oxford dictionary’s second definition. The first is “a situation in which many things are destroyed and many people killed, especially because of a war or a fire.” Long before 1941 the term was used in this way, and even more broadly--see, for example, F. Scott Fitzgerald’s The Great Gatsby.

 

I use “holocausts” here because it seems appropriate for situations “in which many things are destroyed and many people killed.” And it captures best the enormity of what European colonizers and then white Americans have done to Native and African Americans. Moreover, some scholars, such as Russell Thornton in American Indian Holocaust and Survival: A Population History since 1492 (1990) and David Stannard in American Holocaust: Columbus and the Conquest of the New World (1993), have previously applied the word in a similar way. 

To get some idea of the numbers involved, let’s start with the following quote from Jill Lepore’s These Truths: A History of the United States. “Between 1500 and 1800, roughly two and a half million Europeans moved to the Americas [note: not just the USA]; they carried twelve million Africans there by force; and as many as fifty million Native Americans died, chiefly of disease,” most of them because they had no immunity to the diseases passed on to them by those of European ancestry. In discussing this most common means of death, however, historian Roxanne Dunbar-Ortiz emphasizes that colonizers didn’t regard all those deaths as just unfortunate accidents, but from the very beginning intended to eliminate, one way or another, Indian civilization. Stannard stressed the continuing interaction of the diseases with a “deliberate racist purge.”

Although estimates of the Native American population in 1492 in what is today the conterminous United States vary widely, Thornton estimated it at about 5+ million, and added that it “declined to but 600,000 by 1800,” and “to about 250,000 by the last decade of the nineteenth century…. This was a population some 4 to 5 percent of its former size.” No wonder he referred to the “American Indian Holocaust.”

Stannard claimed that “the destruction of the Indians of the Americas was, far and away, the most massive act of genocide in the history of the world.” And despite some twentieth-century improvements and an increased Indian population, he still faulted (in 1993) the U. S. government for “its willful refusal to deal adequately with the life-destroying poverty, ill health, malnutrition, inadequate housing, and despair that is imposed upon most American Indians who survive today.”

One government report indicates that Native Americans “have long experienced lower health status when compared with other Americans. Lower life expectancy and the disproportionate disease burden exist perhaps because of inadequate education, disproportionate poverty, discrimination in the delivery of health services, and cultural differences.” In 2020, according to a late-July New York Times article, “there are strong indications that Native Americans have been disproportionately affected by the coronavirus. The rate of known cases in the eight counties with the largest populations of Native Americans is nearly double the national average.”

According to the 2018 Census, Native Americans have a national poverty rate of 25.4%, African Americans 20.8%, Hispanics 17.6%, and Whites 8.1%. Indians also are less educated than other groups. From 2013 to 2017, only 14.3% of Native Americans had a bachelor’s degree or higher, compared to 15.2% of Hispanics, 20.6% of African Americans and 34.5% of Whites. Also, “Native Americans experience substance abuse [including alcohol] and addiction at much higher rates than other ethnic groups.”

With any such woes there is always the question of responsibility. To what extent is it societal or due to personal failings? Although the exact mix is difficult to determine, there is no doubt that we Whites and the governments that have represented our interests bear a heavy responsibility for the historic mistreatment of Native and African Americans.

Of the horrendous conditions facing slaves captured in Africa and sent to the United States much has been written. By 1790, “there were almost 700 thousand slaves in the US . . . which equated to approximately 18 percent of the total population.” By 1860, “there were four million slaves in the South, compared with less than 0.5 million free African Americans in all of the US.” The slaves, who made up about half of the South’s population, worked mainly on large cotton plantations. (For a fictional treatment of slavery, see Harriet Beecher Stowe’s Uncle Tom’s Cabin.)

After the Civil War years and the end of slavery, the Reconstruction era (1865–1877) followed, but it got off to a slow start due to President Lincoln’s assassination and his successor’s presidency. For Andrew Johnson was a former Tennessee governor and, in the words of historian Eric Foner, “incorrigibly racist.” As Lepore writes, “By the winter of 1865–66, Southern legislatures consisting of former secessionists had begun passing ‘black codes,’ new, racially based laws that effectively continued slavery by way of indentures, sharecropping, and other forms of service.” In 1866 the Ku Klux Klan began its decades of White terror against Black people.

Congress, however, opposed Johnson and Southern racist policies. In April 1866, it overcame a Johnson veto to pass the Civil Rights Act. During the two-term presidency of Ulysses Grant (1869-1877), Reconstruction policies, aided by federal troops stationed in the South, prevailed. According to Douglas Brinkley, during Reconstruction 22 Black legislators served in Congress, and “Blacks were elected to the legislatures of every one of the Confederate states.”  

Yet, as Lepore writes, “Political equality had been possible, in the South, only at the barrel of a gun.” What followed was what Henry Louis Gates Jr. calls “the Redemption era” of 1877 to 1915, when Black enfranchisement ended; the Klan and other Whites terrorized Black people; and Jim Crow laws segregated them from Whites in various places from playgrounds to public transport.  

Thousands of Black men were also lynched. As late as 1930, in Marion, Indiana, two innocent young Black men were lynched, surrounded by white onlookers. In Cincinnati, where I grew up, the swimming pool at Coney Island did not permit Black swimmers until 1961. In the early 1960s, I lived in northern Virginia, where interracial marriage was still prohibited and where I remember picketing a movie theater that still discriminated against Black patrons.

More recently, Gates recalls other signs of racism: Dylann Roof murdering “the Reverend Clementa Pinckney and the eight other innocents in Mother Emanuel AME Church in Charleston, South Carolina, on June 17, 2015”; the white supremacy rally in Charlottesville, Virginia, on August 12, 2017, “when an avowed white supremacist plowed his car into a crowd of counter-protesters”; and a White racist, in October 2018, unable to get into the “predominantly black First Baptist Church in Jeffersontown, Kentucky,” settling instead for fatally shooting “two African American shoppers” at a local Kroger. 

Between 2018 and today, besides the George Floyd and Jacob Blake episodes, numerous other cases reflecting our ongoing racism could be mentioned, many of them abetted by President Trump, who commented about the white supremacy rally in Charlottesville and opponents of it, “there's blame on both sides,” and there “were very fine people, on both sides.”

The disadvantaged poverty rate and educational attainment of Black Americans have already been mentioned. In addition, the life expectancy of Black males is lower and their incarceration rate much higher than for White men. In 2020, as the government CDC reports, Black and other “racial and ethnic minority groups are being disproportionately affected by COVID-19. Inequities in the social determinants of health, such as poverty and healthcare access, affecting these groups are interrelated and influence a wide range of health and quality-of-life outcomes and risks.”   

Thus, from the days of Columbus to the present, Native and African Americans have suffered grievously. Despite significant gains, including electing our first Black president, the Trump presidency has set back attempts to alleviate the effects of past injustices. Trump has reinforced White resistance to any efforts toward atoning for past and present racial injustices.

In the early postwar years after the downfall of the German Nazis, Germans also resisted any talk of guilt. In his much discussed 2014 essay, “The Case for [U. S.] Reparations,” Ta-Nehisi Coates mentioned this, but then indicated how the Germans came around to owning up to their Holocaust guilt. More recently (in 2019) Susan Neiman, in her Learning from the Germans: Race and the Memory of Evil, noted that “after the 1963 Birmingham church bombing, James Baldwin said that white Americans share collective guilt for the persecution of black Americans as Germans did for their silence during the Nazi persecution of Jews.” Later Neiman added, “after white nationalist demonstrators [in 2017] screamed ‘Blood and Soil’ in Charlottesville, does the comparison require further argument?”

Having been raised in the U. S. South, and then spent many years in Germany, Neiman details how both areas have (or have not) grappled with responsibility for their racial crimes. While presenting numerous examples of how Germans have attempted to atone--see a recent HNN article for a few examples--she writes that “America's failure to face its past is evident not only in the vicious outbursts of white supremacy that Donald Trump encouraged, but in subtler ways as well.”

Exactly what form this atonement should take is a complicated question, but first we have to own up to our responsibilities. And this is where we historians come in. Some, like Thornton, Stannard, and Gates, have stressed past racial injustices. But many others have not sufficiently done so. More of us have to write and teach U. S. history in such a way that readers and students recognize how grave U. S. injustices to Native and African Americans have been. 

In his Lies My Teacher Told Me: Everything Your American History Textbook Got Wrong, James Loewen writes “our teachers and our textbooks still leave out most of what we need to know about the American past.” This is especially true in pre-college classrooms and regarding the depth of our past racial injustices. Loewen writes, “Almost half the states have textbook adoption boards. Some of these boards function explicitly as censors, making sure that books . . . avoid topics and treatments that might offend some parents.” 

Under such circumstances, teaching history as we should, especially in pre-college courses, can be very difficult. But as I have written previously, avoiding parental disapproval or teaching patriotism should not be our goal. “Historians’ main allegiance should be to truth-telling.” And the truth includes owning up to the grievous harm we have done to Native and African Americans. Only then can we atone for our past misdeeds and create a truly “composite nation,” where we are “made better, and stronger,” not in spite of our “many elements, but because of them.” 

The Religion of Patriotism

“I pledge allegiance to the flag of the United States of America and to the republic for which it stands, one nation under God, indivisible, with liberty and justice for all.”

 

I have not said that pledge for many years, but I remembered every word from all the times I had said it as a youth, in school and at other places. The history of that wording reflects the history of American patriotism.

 

The pledge was written by Francis Julius Bellamy (1855-1931) in 1892. He had studied at the Rochester Theological Seminary to become a Baptist minister, following his father, a Baptist minister in Rome, NY. He led congregations in Little Falls, NY, and then in Boston. Bellamy believed that the rights of working people and the equal distribution of economic resources were inherent in the teachings of Jesus. In the labels of the late 19th century, he was a Christian socialist.

 

His cousin Edward Bellamy, whose father was also a Baptist minister, shared Francis Bellamy’s late 19th-century version of liberation theology. He wrote the novel Looking Backward: 2000–1887 (published in 1888), a futurist fantasy in which a Boston man falls asleep in 1887 and wakes up in 2000, when the United States has been transformed into a socialist utopia: all industry is nationalized, working hours are reduced with retirement at age 45, and all goods are distributed equally. Looking Backward, along with Harriet Beecher Stowe’s Uncle Tom’s Cabin, was among the great best-sellers of the late 19th century. His sequel, entitled simply Equality (1897), promoted equality for women, and imagined television, air travel, and universal vegetarianism.

 

The Bellamy cousins wanted radical change, but so did millions at that time. During the last decades of the 19th century, often labeled the Gilded Age, rapid industrialization and capitalism unfettered by regulation led to widespread poverty and unprecedented concentrations of wealth. The top 1% owned half of the nation’s property, and the bottom 44% owned 1%. American industry had the world’s highest accident rate. Socialist and labor movements grew in response.

 

Francis Bellamy preached against the evils of capitalism, offered a public education class entitled “Jesus the socialist”, and was founding vice president of the Society of Christian Socialists. He was forced out of his Boston congregation in 1891. Daniel Sharp Ford, a member of Bellamy’s congregation who published Youth’s Companion, a children’s magazine, hired him to promote Ford’s campaign to put an American flag in every school. To coincide with the 400th anniversary of the voyage of Christopher Columbus in 1892, in coordination with the World’s Columbian Exposition in Chicago in 1893, Bellamy wrote a flag pledge published in Youth’s Companion in September 1892.

 

Bellamy’s pledge read: “I pledge allegiance to my Flag and to the Republic for which it stands, one nation, indivisible, with liberty and justice for all.” He later wrote about his thinking when composing the pledge. The Civil War led to his reference to “indivisible”. Although he was deeply religious, he strongly believed in the separation of church and state, thus including no reference to God. He had been inspired by the French Revolutionary slogan, “liberty, equality, fraternity”, but wrote, “No, that would be too fanciful, too many thousands of years off in realization. But we as a nation do stand square on the doctrine of liberty and justice for all.” He knew that most state superintendents of education were opposed to equality for women and African Americans.

 

Bellamy’s political thinking was among the most progressive of his era, but did not escape the racism inherent in American culture. He argued that the assimilation of non-white “races” into American society would lower “our racial standard”. His and Ford’s and official America’s veneration of Columbus was itself a political statement based on white supremacy and targeted at Italian voters.

 

As a national ritual of patriotism, the pledge has been yanked to the right in the 20th century. In 1924, the conservative leaders of the American Legion and the Daughters of the American Revolution persuaded the National Flag Conference to change “my flag” to “the flag of the United States of America”, despite Bellamy’s opposition. A much more serious distortion was added in 1954 with the words “one nation under God”. Although that change is often attributed to the recommendation of President Dwight Eisenhower, its longer history, as described by historian Kevin Kruse in One Nation Under God: How Corporate America Invented Christian America in 2015, is much more revealing.

 

In response to Franklin Roosevelt’s New Deal, which introduced significant regulation of business and empowered labor unions, giant corporations created a public relations campaign for big capitalism using organizations like The American Liberty League. This secular political campaign was a flop. Jim Farley, chair of the Democratic National Committee under Roosevelt, said, “They ought to call it The American Cellophane League, because No. 1: It’s a DuPont product, and No. 2: You can see right through it.”

 

Corporate America then turned to conservative Christian ministers, literally employing them to link capitalism with Christianity by arguing that the New Deal was evil and capitalism was “freedom under God”. In 1951, Cecil B. DeMille organized a Fourth of July ceremony, backed by the leaders of corporate America and hosted by Jimmy Stewart, carried live over national radio. Their message was that “the American way of life” was Christian individualism expressed in unchecked capitalism.

 

This is the background for the insertion of religious messages into American patriotic rituals. The pledge now asserted that the separation of church and state was un-American. “In God We Trust” appeared on a postage stamp that same year, 1954, and on paper money in 1955. In 1956, it became our first national motto. Since Ronald Reagan began using “God bless America” to end his speeches in the 1980s, that phrase has become a staple in both parties, like the flag pin as patriotic adornment.

 

The claim in the modern Pledge of Allegiance about “liberty and justice for all” is not true today. In the Jim Crow era, when I first learned to recite it, it was an outright lie. The stirring words of the national anthem about America, “the land of the free”, were similarly false. Like the Lost Cause mythology about the Civil War and its aftermath, which was enshrined in the school textbooks I read and taught as American history, these assertions were propaganda for an American society based on white supremacy. Patriotic rituals were designed to indoctrinate young and old with the belief that the racist, sexist, antisemitic America of the 20th century was already perfect, that criticisms of racial injustice or gender discrimination were illegitimate, that America was God’s country and corporate capitalism was God’s handiwork.

 

On Flag Day in 1943, the Supreme Court declared, in West Virginia State Board of Education v. Barnette, that a law requiring schoolchildren to salute the flag and recite the Pledge was unconstitutional. That ruling still stands as settled American constitutional law. Justice Robert H. Jackson wrote then, “If there is any fixed star in our constitutional constellation, it is that no official, high or petty, can prescribe what shall be orthodox in politics, nationalism, religion, or other matters of opinion.” Yet the patriotic rituals we take for granted do exactly that, prescribing that belief in a specific kind of God is patriotic and that freedom and justice for all already exist.

 

Steve Hochstadt

Jacksonville IL

September 8, 2020

Dirty Politics, Then and Now

Donald Trump has brought to American politics and to the presidency a uniquely personal, combative, and mean-spirited style, honed in the cutthroat worlds of high-end real estate, finance, and show business, that relies on personal insults and denigration of foes. In the realm of dirty politics, virtually by default, he has been allowed to rewrite the rules of the game—to the enormous detriment of our country.              

To be sure, personal attacks, mudslinging, and name calling date to the beginning of this republic. “JEFFERSON—AND NO GOD,” nervous Federalists screamed in 1800 in a vain effort to thwart election of the allegedly infidel, pro-French Virginian. Andrew Jackson’s rivals had the temerity to besmirch his beloved wife, Rachel. The twin issues of slavery and secession made the election of 1860 especially ugly. Abraham Lincoln’s enemies depicted him as a “horrid looking wretch,” assaulted him with vicious racist attacks, and claimed that he favored miscegenation.  

Sometimes the mudslinging took on a lighter tone. In 1884, Democrat Grover Cleveland was rumored to have fathered a child out of wedlock, inspiring the ditty “Ma, ma, where’s my pa? Gone to the White House, ha, ha, ha.” At times, it has been downright silly. In 1944, Republicans charged that Franklin Roosevelt had wasted millions of taxpayer dollars by sending a U.S. Navy vessel back across the Pacific to rescue his dog Fala, who, allegedly, had been left on a remote island. Today’s politicians thus follow a time-honored tradition.

But there is also something new in the volume of the attacks, who is purveying them, and how they are purveyed. The internet gives a platform to political hacks, private citizens, extremist groups, conspiracy theorists, and even foreign governments—think Russia and Iran—to spread misinformation and outright lies with little or no test for truthfulness. This has exponentially increased the volume of personal smears and made them nastier.

More important is the role of the mud-slinger-in-chief. In years past, presidents have generally turned a blind eye to the antics of their zealous underlings. They have left to others the job of responding to attacks. Surprisingly, perhaps, Jackson defended his wife’s honor with words rather than dueling pistols. In another exception, FDR turned the tables on his opponents with a brilliantly sarcastic riposte pronouncing that Fala was a Scotty and his “Scotch soul” would never condone such a waste of money. “Roosevelt’s dog got Dewey’s goat” was the verdict on this incident.   

Trump is the ringmaster of today’s supercharged political circus. The role comes naturally to him. It is a part of his persona and his modus operandi. He is a master of innuendo and hyperbole. He is not troubled by moral or ethical standards, and has no more than a passing acquaintance with the truth. He delights in churning up chaos and seeks to exploit it to his advantage. He early latched on to Twitter, and he spews out with seeming impunity so many tweets that opponents are hard pressed to know how or whether to respond. Mud-slinging seems to be the one part of the job that he truly enjoys. One must grudgingly concede that he has a certain gift for it.

He ventured into dirty politics before he ran for office by taking up the notorious “birther” theory that Barack Obama was not eligible for the office he already held. In 2016, presidential candidate Trump dreamed up belittling nicknames for his primary opponents: “Low Energy Jeb” (Bush); “Little Marco” (Rubio); “Lyin’ Ted” (Cruz). He hawked the preposterous insinuation that Cruz’s father was involved in the JFK assassination. In the campaign itself, he labeled his opponent “Crooked Hillary” for her alleged misuse of government emails and led the chant “Lock Her Up.” His slurs can be blatantly racist and sexist: “Pocahontas” for Senator Elizabeth Warren, for example, and the unspeakably crude remarks directed at debate moderator Megyn Kelly after she dared challenge him.

Seeking reelection, he has picked up where he left off, shifting the low energy moniker to “Sleepy Joe” and “Slow Joe” Biden and questioning his opponent’s mental acuity. Childishly, he insulted Vice-Presidential nominee Kamala Harris by knowingly mispronouncing her first name. He launched a half-baked birther theory for her as well and piled on with a barrage of misogynist slurs: “phony Kamala,” “nasty,” “angry,” “madwoman.” The Trump family low--so far--has been the “like” (later removed) that son Eric attached to a tweet calling Harris a “whorendous pick.” Older son Don Jr.’s labeling of Biden as the “Loch Ness monster of the swamp” ranks a distant second. Junior has also hinted that Biden may be a pedophile.

It could be argued, of course, that Trump is simply being honest, that he is merely bringing into the open stuff that usually remains behind the scenes. But that is too easy. What the president of the United States says makes a difference. He demeans himself, if that matters. He demeans the office of the presidency, one of the most prestigious positions in the world. He demeans this nation in the eyes of its own citizens and the world. Actions have consequences, and Trump’s tirades can incite his followers and provoke his opponents to act similarly, revving up the already rampant divisiveness in this country, sparking hatred and violence, and even causing death, as with the killing of two people by a teenage Trump enthusiast in Kenosha, WI. “The unthinkable has become normal,” Senator Bernie Sanders has rightly noted.  

Shamelessly, the Republicans either won’t or can’t rein in their president. The only way to get rid of his toxic influence is to vote him out.   

“We Are Ourselves”: Review of For Workers’ Power: The Selected Writings of Maurice Brinton  

 

For Workers’ Power: The Selected Writings of Maurice Brinton, David Goodway, ed. (AK Press, 2020)

 

By day, Christopher Agamemnon Pallis was a distinguished British clinical neurologist, scion of a prominent Anglo-Greek family that included poets, soldiers, businesspeople, a botanist, and an authority on Tibetan Buddhism. The rest of the time, he was “Maurice Brinton,” one of the leaders of Solidarity, a breakaway British Marxist group, and the author of a stream of polemics, reporting, and historical works that helped move much of the left from hard Marxism to libertarian socialism—and, in some cases, anarchism. Chris Pallis’s pseudonymous writings, along with those of other maverick materialists, freed the left from its own past and are one of the reasons that today’s left is typified not by party cadres and excruciating sectarian quarrels but by Black Lives Matter, Antifa, anarchists, Mexico’s Zapatistas, and indigenous movements in Latin America and Asia.

For Workers’ Power: The Selected Writings of Maurice Brinton is an expanded edition of a collection that first appeared in 2005. It provides more than an introduction to this shadowy but important figure’s work; it includes just about everything important that he wrote under the Brinton byline. British scholar David Goodway, who edited the collection, provides a compact account of Pallis’s life (1923-2005) and unusual career, including the time he was outed under his original pseudonym, “Martin Grainger,” and nearly lost his position as consultant in neurology at the Hammersmith Hospital, as well as an introduction that neatly traces his intellectual development and accomplishments.

Brinton was an excellent writer, scholar, and eyewitness journalist; For Workers’ Power includes political commentary and critique, theoretical writings, his groundbreaking historical work on the Russian Revolution, and powerful accounts of key events in postwar working-class politics, including the May 1968 uprising in Paris and the failed Portuguese revolution of 1974. Two things make Brinton (as I’ll refer to him from now on) and the Solidarity circle interesting and important today: their very serious effort to understand what went wrong with Soviet Russia and the Marxist Left in the decades after the Russian Revolution, and their commitment to finding ways that revolutionary socialists could adjust to the new economic and social realities of the postwar era.

To put this in context, Solidarity and Brinton were part of a generation of postwar thinkers and activists who came of age as Marxists (Brinton joined the Communist Party at university in 1941), then gradually rejected many of the basic assumptions of Marx as flawed or outdated. Some of the biggest names on the Continent were Cornelius Castoriadis (many of whose works Brinton translated into English), Claude Lefort, and Jean-François Lyotard.

Like them, Brinton realized that the kind of proletarian politics that Marxists had practiced in the 19th and the first half of the 20th centuries wouldn’t work anymore. Ethnic and racial issues were becoming more important; the western working class had grown more prosperous, then splintered and moved to the right (in many cases) as the industrial economy evolved; and women’s and LGBTQ rights (among others) were becoming immediate issues. How, he wanted to know, can you forge a powerful workers’ movement of the left in a world where the sharp definitional lines that Marx drew were suddenly blurring?

Brinton could see these developments coming years before terms like “neoliberalism,” “globalization,” and computerization were common. “New productive techniques have led to greater division between the producers,” he wrote in 1961. “Thousands of jobs and professions formerly requiring skill and training and offering their occupants status and satisfaction have today been stripped of their specialized nature. Not only have they been reduced to the tedium and monotonous grind of any factory job, but their operatives have been degraded to simple executors of orders, as alienated in their work as any bench hand.”

“Marxists,” he added, “would be better employed analyzing the implications of this important change in the social structure rather than waving their antiquated economic slide-rules.” The French crisis of 1968 involved university students who were far from starving and factory workers at Renault and Sud-Aviation who were among the best paid in the country. What was driving them, Brinton asked, and what would drive future uprisings? All working people, and not just those whom Marx classified as “workers,” were increasingly cut off from the management of their own lives, and felt it, even if they didn’t always know how and couldn’t find a way to pull together in opposition to the new post-industrial landscape.

“We live in neither the Petrograd of 1917 nor the Barcelona of 1936,” Brinton wrote in 1972. “We are ourselves: the product of the disintegration of traditional politics, in an advanced industrial country, in the second half of the 20th century. It is to the problems and conflicts of that society that we must apply ourselves.”

By and large, the supposed leadership of the working class wasn’t much help, because it didn’t understand, or didn’t want to understand, what was changing. Time and again—in Hungary in 1956, in Belgium during a 1960 general strike, in France in 1968, in Portugal in 1974—a potentially revolutionary movement, bringing together industrial and agricultural workers, students, and other dissidents, was discouraged not by a right-wing government but by a sclerotic Communist Party that was suspicious of any formation that didn’t fit its ideological preconceptions. Social democratic and labor parties in Europe and North America play a similar role in the neoliberal present.

Even in the days of the Russian Revolution, Brinton argued, Marxists were wrong to think that working people’s lives were entirely defined by economic circumstances, and many of their mistakes sprang from their failure to understand the other aspects of human existence. In an approving article about the renegade psychoanalyst and sometime Marxist Wilhelm Reich, Brinton quotes Reich’s 1934 pamphlet, What is Class Consciousness?, which was becoming a popular New Left text. Mass consciousness, Reich wrote, is “made up of concern about food, clothing, family relationships, the possibilities of sexual satisfaction in the narrowest sense, sexual pleasure and amusements in a broader sense, such as the cinema, theatre, fairground entertainments and dancing.” It is concerned “with the difficulties of bringing up children, with furnishing the house, with the length and utilization of free time, etc.”

The conclusion, for Brinton, is that a revolutionary party can’t create revolutionary class consciousness; it has to come from the bottom up, from working people themselves and their understanding of their lives. What’s left of Marxism, then? Not much, but it’s important: workers’ control of production. Socialism isn’t about who owns the fields, the factories, and the workshops, but who controls them. States can expropriate real estate, factories, and data sets—as was done in Russia in 1917 or China in 1949—but if they put them in the charge of a professional managerial class, that’s not workers’ control.

Brinton stakes out an uncompromisingly radical position that would be familiar to many anarchists today: socialism and economic democracy can’t be achieved through reform or even a revolution within the State, like the one the Bolsheviks pulled off. They require a “total social revolution,” including workers’ management of production. A “meaningful” social revolution only comes about “when a large number of people seek a total change in the conditions of their existence.”

Just as a practical matter, simply taking over the government, and even the levers of the economy, is not enough. “No island of libertarian communism can exist in a sea of capitalist production and of capitalist consciousness,” Brinton wrote; any attempt to do so will revert to a capitalist model, sooner rather than later.

Does this make Brinton an anarchist? He lamented that Marxist revolutions all too often produced authoritarian regimes, and he rejected scientific socialism—the theory that the study of historical trends can predict their future development. “Genuine creation is the act of producing that which is not totally implicit in the previous state of affairs,” he wrote. “By its very nature it defies the dictates of predetermination. For those who see history as the unfurling of a dialectical process which leads inevitably ‘forward’ towards a particular brand of ‘socialism’ … there is no real history. There are just mechanisms.”

Those certainly sound like the words of a contemporary anarchist; “Brinton’s politics are fully anarchist,” Goodway firmly asserts. But I’m skeptical. While he rejected much of Marxism, Brinton still relied on Marxist terminology and categories of thought. The few references to major anarchist thinkers like Proudhon, Bakunin, and Kropotkin suggest that he never seriously grappled with their ideas. A better way to view Brinton—and Chris Pallis—is through the lens of the history that succeeded him. 

Today, much of the left is more concerned with building social movements than with political parties or seizing power. Anarchists, libertarian socialists, Indigenous organizers—even Black Lives Matter—would likely agree on the need to keep agency in the hands of these movements, not cede it to a clique of politicians or a revolutionary conspiracy. That’s a demanding assignment; Brinton was unsparing in defining the terms and exposing the pitfalls.

Is This the Most Important Election?

The conventions are now behind us, and the post-Labor Day period is often considered the launch of the full presidential campaign season. As in most election seasons, this one is being cast in apocalyptic terms by the two parties. “Do not let them take away your democracy,” former President Obama urged during his convention speech. “This is the most important election in our history,” President Trump countered.

 

Is Trump right? Is this the most important election in our history? Is democracy on the ballot, as Obama claimed? Or is that just a conceit, something we say every four years? Perhaps a look at some other crucial elections in our history will help to enlighten us.

 

The election of 1800 established the first peaceful transfer of power in the United States, one that almost didn’t happen. Without this, it is hard to see how America would have become a democracy. The election featured two men who were old friends and now political rivals: John Adams and Thomas Jefferson. They had faced each other in 1796, with Adams prevailing. Jefferson, who came in second, became vice president based on the original wording of the Constitution, under which electors voted for two people. The one with the most votes became president, while the runner-up became vice president.

 

The two had a brief flirtation with bipartisanship at the beginning of Adams’ term, but things soon fell apart over lingering differences in the direction the new nation should take, including over foreign policy. Relations with revolutionary France had soured over the Jay Treaty, which was seen as pro-British. Adams ended up in a quasi-war with France, and his Federalist Party passed a series of bills known as the Alien and Sedition Acts. The Sedition Act was clearly pointed at Jefferson and the Republicans, making it illegal to publish “false, scandalous, and malicious writings against the United States.” Partisanship had spun out of control by the late 1790s, and actual violence between the two sides, both in Congress and in the streets, broke out.

This was the setting as the election of 1800 unfolded. Surprisingly, Jefferson and his vice-presidential candidate, Aaron Burr, tied with 73 electoral votes, while Adams received 65 electoral votes. The election was thrown to the House, but the Federalists began to consider extraconstitutional means to deprive Jefferson of the presidency. Jefferson then warned Adams that this “would probably produce resistance by force and incalculable consequences.” Ultimately Jefferson emerged as the winner after thirty-six ballots. While Adams peacefully gave up power, he refused to attend Jefferson’s inauguration. As David McCullough has written, the “peaceful transfer of power seemed little short of a miracle…and it is regrettable that Adams was not present.”

 

The election of 1860 took place when the future of the nation was literally at stake. Abraham Lincoln, a man who had risen from humble circumstances, had become one of the leaders of the new Republican Party in the 1850s. Lincoln wanted to stop the spread of slavery into the new territories that had been obtained during the Mexican-American War. His main rival for power, Stephen Douglas, believed that each territory should vote on whether to allow slavery, that popular sovereignty was the answer. Lincoln’s response is instructive. “The doctrine of self government is right---absolutely and eternally right—but it has no just application” to the issue of slavery, which Lincoln believed was morally wrong. 

 

Lincoln, the dark horse candidate for the Republicans, emerged on the third ballot at the convention in Chicago. Douglas won the Democratic Party nomination, but it was a Pyrrhic victory. The Democrats from the South had walked out of the convention and nominated Vice President John C. Breckinridge as their candidate. To make matters worse, a fourth candidate joined the fray as John Bell of Tennessee ran for the Constitutional Union Party. Ultimately Lincoln prevailed in the election, winning solidly in the North and West, but barely gaining any votes in the South. By mid-December, South Carolina had seceded from the Union, and the Civil War began in April when Southerners fired on Fort Sumter in Charleston harbor.

 

The question at the start of the war was whether the Union would survive, but ultimately the next four years of Civil War would lead to the elimination of slavery in the United States and “a new birth of freedom” for the nation, as Lincoln framed it at Gettysburg. The question of who can be an American, of who is part of the fabric of our nation, continued to evolve. During a brief period known as Reconstruction, America began to live up to its founding creed, that all are equal. Amendments were added to the Constitution which formally ended slavery, provided for birthright citizenship and equal protection under the law, and allowed black men to vote. But that era was just a blip in our history; segregation and Jim Crow laws soon emerged and would not be removed until the Civil Rights protests of the 1960s.

 

The 1932 election occurred against the backdrop of the Great Depression. Herbert Hoover had been elected in 1928 as the “Great Engineer.” He had made a fortune as a geologist in mining and then had become involved in public affairs. “The modern technical mind was at the head of government,” one admirer wrote of the president. Hoover has often been cast as a disciple of laissez faire when it came to the economy, but he in fact believed in “government stimulated voluntary cooperation,” as historian David Kennedy has written. He took many actions early in the crisis, like getting businesses to agree to maintain wages and urging state and local governments to expand their spending on public works. But Hoover was limited by his own view of voluntary action and could never bring himself to use the federal government to take direct action to fight the depression.

 

Franklin Delano Roosevelt had no such qualms. A rising politician in the early part of the 20th century, FDR had been struck down by polio in 1921. It made him a more focused and compassionate man who identified with the poor and underprivileged, as Doris Kearns Goodwin argues. Roosevelt began with some bold pronouncements, talking about “the forgotten man at the bottom of the economic pyramid” and of the need for a “new deal for the American people.” Those two words, which James MacGregor Burns has written “meant little to Roosevelt and the other speech writers at the time,” soon came to define Roosevelt’s approach to the depression. FDR swept to victory, winning almost 60 percent of the popular vote and 42 of the then 48 states. The election established that the government had a responsibility for the well-being of the people of the nation. FDR would eventually adopt the Four Freedoms as part of his approach, which included the traditional support for freedom of speech and worship, but also freedom from want and fear.

 

The 2020 election features each of the elements that made these prior elections so important. Democracy and the peaceful transfer of power are clearly on the line. Donald Trump has already called into question the fairness of the election, especially over mail-in voting, and has begun once again to claim that he will lose the election only if it is rigged. One can imagine Trump refusing to leave office if he loses a close election to Joe Biden.

 

The unity of our nation is also at stake. Trump “is polarization personified” who has “repeatedly stoked racial antagonism and nativism,” political scientists Suzanne Mettler and Robert C. Lieberman write. Trump has even been encouraging violence on the part of his supporters over Black Lives Matter protests. “The big backlash going on in Portland cannot be unexpected,” Trump tweeted regarding the violence perpetrated by his supporters.

 

Prior to COVID-19, Trump’s economic and tax policies favored the already wealthy and contributed to ever-worsening income inequality. To their credit, the president and his party supported an aggressive initial stimulus package to assist businesses and individuals. The extent to which the Republican Party will continue to support aggressive government action in response to the economic damage caused by the coronavirus, in order to aid the middle and working classes rather than the wealthy, is an open question.

 

President Donald Trump may indeed be right: this is the most important election in our history. Just not for the reasons he believes.

The Roundup Top Ten for September 18, 2020

The Deep Roots of Disdain for Black Political Leaders

by Carole Emberton

From Thomas Jefferson's writings, through the proslavery argument of the middle of the 19th century, the overthrow of Reconstruction, and the Jim Crow era, American politics has been influenced by the racist idea that Black people were incapable of exercising leadership in a democracy.

 

Who Owns the Evidence of Slavery’s Violence?

by Thomas A. Foster

A lawsuit demands that Harvard University give custody of famous images of enslaved men and women--taken without consent by a biologist seeking to demonstrate white supremacy--to the subjects' descendants. A Howard University historian agrees, putting the images in the context of other intimate violations endured by enslaved persons.

The Long History Behind Donald Trump’s Outreach To LGBTQ Voters

by Neil J. Young

Gay Republicans emerged as a political force in response to both radical leadership in the gay liberation movement and the rise of evangelicals as a force in the Republican party. Today they may have to decide which fight is more important. 

Lampooning Political Women

by Allison K. Lange

Backlash against women's emancipation in the nineteenth century took to the most potent social media of the day--political cartoons--to decry feminism as a threat to civilization itself. 

The Dark Side of Campus Efforts to Stop COVID-19

by Grace Watkins

While colleges have a legitimate interest in suppressing virus transmission on campus, it is dangerous to expand the surveillance powers of campus police. 

The Forgotten History of the Radical ‘Elders of the Tribe’

by Susan J. Douglas

The Gray Panthers fought for the civil rights, social services and respect denied to older Americans. But they did so by challenging inequality in ways that sought alliances instead of antagonism between young and old. 

Why Do Women Change Their Stories Of Sexual Assault? Holocaust Testimonies Provide Clues

by Allison Sarah Reeves Somogyi

Despite the horrific frequency of sexual abuse of women during the Holocaust and during World War II, stigmas attached to victims encouraged survivors to self-censor in their testimonies. The historical record may help to understand the behavior of victims today.

American Democracy Is in the Mail

by Daniel Carpenter

The Postal Service has been a circuit of information vital to democracy, a non-exclusionary employer, and a service connecting all communities in the nation. It's also been a tool of conquest and voracious capitalism. For good and ill, the history of the USPS is the history of America. 

Why ‘Glory’ Still Resonates More Than Three Decades Later

by Kevin M. Levin

The film based on the story of the 54th Massachusetts Volunteer Infantry is streaming on Netflix. Kevin Levin suggests that despite the narrative license taken, the film puts the story of Black freedom fighters and the question of emancipation at the center of the story of the Civil War. 

Where Kamala Harris’ Political Imagination Was Formed

by Tessa Rissacher and Scott Saul

A Black cultural center in Berkeley introduced Kamala Harris to activism and the connections between culture and politics. 

The Pentagon is Missing the Big Picture on "Stars and Stripes"

Editorial Room of Stars and Stripes, WWI

In February, the Pentagon proposed slashing funding for the famed soldiers’ newspaper Stars and Stripes, a story that roared back into the news in September after its publisher reported he had been ordered to halt publication by the end of the month. By the morning of September 4th, President Trump tweeted out his insistence that the paper would not be closing, though the Senate has not yet voted on a defense appropriations bill to resolve this issue. While veterans have raised important concerns about the elimination of an important journalistic voice independent of military officials’ control, administrators framed the paper’s closure as a budgetary issue to save $15.5 million--a seeming pittance in a department collecting over $705 billion in federal funding. If budgetary issues are truly the concern, I’d propose considering the paper’s history, which demonstrates how the paper can and does function, alongside its journalistic mission, as an important part of the public-private partnerships driving the country’s economy.

 

In recent days, many articles have mentioned the paper’s roots during the Civil War, but few have described crucial developments during the First World War, when Guy T. Viskniskki, a Spanish-American War veteran and New York area journalist for the Wheeler Syndicate, argued that a new paper describing the wartime experience through the eyes of rank-and-file servicemen could raise morale without becoming a form of propaganda. While training at Camp Lee outside of Richmond, Viskniskki established a newspaper for the community, one of many camp newspapers funded by the YMCA’s War Work Council. However, when he reached Europe in November 1917, Viskniskki dreamed of establishing a paper free from the oversight of his commanding officers and believed his knowledge of censorship regulations allowed him to effectively follow the rules placed on war correspondents. While Viskniskki was proud to highlight his paper’s financial successes and independence, the Intelligence Section provided the first 25,000 francs he needed to begin publication in January 1918. Viskniskki worked hard to push the paper beyond the narrow, divisional or company focus of other soldiers’ papers, rejecting calls to spin off specialty publications for the Services of Supply because he believed a mass paper created a sense of unity spanning front and rear lines. He found his mass audience by providing troops with news of the war, tales from the home front including sports coverage, and comics, all written in a relatable, informal style. At its peak circulation near the armistice in November 1918, staffers printed 526,000 copies per issue and distributed them to readers on both sides of the Atlantic. When the paper closed shop in June 1919, it had even turned a profit (though Viskniskki was disappointed it was turned over to the US Treasury rather than distributed to French war orphans).

 

Stars and Stripes staff adjust linotype machines, WWI

Viskniskki financed his paper primarily through advertisements, securing free assistance from A.W. Erickson in New York. The paper charged prospective buyers the very low rate of one dollar per inch, raising it to six dollars when the cost of newsprint grew more cumbersome after circulation rose over 400,000. By the third issue, Viskniskki packed the paper with ads for products including Boston Garters, Colgate dental cream, Lowney’s chocolates, Wrigley’s chewing gum, 3-In-One oil, Mennen shaving cream, Fatima cigarettes, and Auto Strop razors. These advertisements were part of a broader modernization of military culture that introduced servicemen to new consumer products. For example, military regulations required troops to maintain clean-shaven faces to create a firm seal for their gas masks, thus requiring them to carry their own shaving kits and creating a market for the supplies advertised in their paper; a 1919 survey by the J. Walter Thompson advertising firm found that 30 percent of safety razor users learned of the product in the army. Similarly, army officers estimated that 50 percent of conscripts had not brushed their teeth on a regular basis, leading them to order over 4 million toothbrushes for troops, who purchased toothpaste at local canteens or post exchanges.

 

Staffers for The Stars and Stripes also developed tools to distribute their paper effectively amid harsh wartime conditions. The paper secured subscribers by collecting a large cash payment up front, making it easy for delivery agents to simply drop off the paper rather than manage individual subscribers’ accounts. Viskniskki relied on staffers from Hachette, a leading French publishing house, to carry papers from train stations to troops even when they were under fire. The paper’s staff also acquired ninety-one cars that allowed field agents to deliver the paper to the most remote regions where their readers served. Such decisions to develop an internal distribution service predated similar efforts by leading retailers by several years; the major mail-order retailer Sears only developed its own trucking service in the early 1920s. The Stars and Stripes’ distribution network was so effective that staffers approached the Red Cross and YMCA, offering to assist them with their pre-USO-era responsibilities of delivering treats and entertainments to troops across service areas.

 

Viskniskki was far from the paper’s only reporter, and its large and successful staff both reinforced the paper’s reputation for independence and established contacts that would further its commercial impact long after the war. While Viskniskki conducted the reporting for the first few issues primarily by swiping official cables from the censorship office, he quickly expanded his team by identifying experienced journalists who had difficulty fitting in with their units because of their writing habits. These included Harold Ross, whose commanding officer in a railway engineering unit forwarded Viskniskki many articles Ross had drafted, alongside a plea to remove the man he considered the unit’s headache; New York Times writer Alexander Woollcott, whom Viskniskki knew wanted to cover the front lines; New York Tribune sports reporter Grantland Rice, who transferred from artillery work mere issues before Viskniskki decided to cancel the sports page amid Americans’ increasing combat responsibilities; fellow Tribune scribe Franklin P. Adams, who opined on military life in his “The Listening Post” column; and Philip Von Blon, the enterprising reporter who developed sources within the SOS and broke the Harts uniform story. These writers formed a close social circle during the war and after, particularly once Ross founded The New Yorker and recruited Woollcott as a writer, the two palling around with Adams in their famed Algonquin Round Table meetings, while Von Blon returned home and took a job as managing editor of American Legion Monthly. Viskniskki himself declined to capitalize on the Stars and Stripes brand he created, turning down a $300,000 offer to establish a paper in the United States. However, other staffers risked Viskniskki’s ire and marketed their connections to the paper when founding veterans’ publications such as The Home Sector.

 

While short-lived, the World War One-era Stars and Stripes demonstrated the features that would become hallmarks of the paper when it resumed publication in 1942. The paper relied on funding from the government and its affiliates to open its doors, and it contributed to the country’s commercial development by inspiring new distribution ideas and familiarizing troops with new products for future consumption. Such factors demonstrate why Congressional leaders are right to offer continued support for a paper that has shaped the country’s economy and culture for over a century.

Native Actors Outside the Frame

Harry Smith, who performed under the name Jay Silverheels, was a Mohawk actor who famously portrayed Tonto on The Lone Ranger.

 

Remember Tonto and the Lone Ranger? You might recognize my book cover, with Harry Smith, aka Jay Silverheels, ready to grab his gun. I am a citizen of Cherokee Nation and an Assistant Professor of History and Native American and Indigenous Studies at Indiana University. My book, Picturing Indians: Native Americans in Film, 1941-1960, has just been released.

 

In this book, I draw attention outside the frame of the films we watch from this era and remind readers that movie sets were workplaces. Although I was interested in all aspects of work on the sets, including that of makeup artists, costumers, and food prep people, just to name a few, I look in particular at those playing Native American characters, especially Native people playing Native characters. This comprises both actors and extras. With actors, I am invoking union guidelines around speaking parts and time on screen, and Native actors never took the lead role. This meant that supporting or minor parts were the highest level Native workers achieved at the time.

 

Some of these men included Harry Smith, or Jay Silverheels, who graces the cover of Picturing Indians. Harry was a Mohawk man from the Six Nations Reserve in Canada, as is Gary Farmer, the actor who appears in many films, including Powwow Highway and Dead Man. Smith had over 100 film credits and a commanding film presence even in the limiting roles he was offered. In spite of working non-stop for decades and generating tremendous wealth for the many studios where he worked, Harry struggled financially his entire life. In LA he rented a one-bedroom apartment near the corner of Sunset and Bronson. He died with massive legal debts, a victim of medical malpractice who dragged himself through a legal battle until the day he passed.

 

Like Harry Smith, Daniel Simmons, a member of the Yakama Nation, used Chief Yowlachie as a name that would define and present him as a Native American to casting agents and the American public. He too had over 100 film credits, but as far as I know never owned a home in Los Angeles. In fact, he rented a granny flat in East LA, where he received his meager checks from the studio.

 

There are several other Native men who worked regularly in supporting roles, and I go over this in the book, but let’s move on to those who worked as extras. Again, I use union terminology, emphasizing that extras are people working in front of the camera with no lines. According to the studios’ archives, hundreds, perhaps thousands, of Native people appeared in movies of the 1940s and 1950s. Sometimes I know their names, such as Plain Feather, a Crow man who worked as an extra in Warpath, and Donald Deer Nose, also Crow, who worked in the same film. Often extras went unidentified in photographs taken by the studio. Only from archival materials would I know, for instance, that the woman in a studio photograph is Diné, or Navajo. Perhaps now that the book is out, I will be able to identify her and stop referring to her and others as anonymous extras.

 

To be clear, Picturing Indians is a behind-the-scenes look at movies of the 1940s and 1950s. Initially I believed the movies and the film sets ran on parallel tracks, separate and uninformed by each other. Yet the more I looked at the archival materials alongside the films themselves, the more I saw just how oppositional they are. What I mean by that is that the films recreate American history in a particular way, usually with complicated plot devices for white characters, extremely simplistic ones for Native characters, and the constants of Indian violence and white innocence. Yet the materials from the sets where Native people worked tell something very different.

 

For instance, an image from the set of Drum Beat of two Apache women being photographed as they take a photograph of Charles Bronson in Indian costume, leaning back seductively in a chair, seems to be saying something about Native women finding Charles Bronson attractive. Yet this film is about hundreds of white soldiers and volunteers hunting down and surrounding Modocs, then executing their leader.

 

Another example comes from an image of an Apache male extra taking a photo of a beaming William Holden on the 1953 set of Escape from Fort Bravo. A studio photographer captured this moment, staged or spontaneous, which seems to indicate pleasure and camaraderie. Yet this film, made by MGM, tells a story about deeply divided northern and southern whites during the Civil War who come together when faced with violence from Apaches.

 

The last example I will give of this disjuncture, and perhaps the most stunning, comes from the set of Far Horizons in 1955. We see tribal chairperson Herman St. Clair with a number of Eastern Shoshone men offering Donna Reed a fishing permit, invoking their sovereign fishing rights, maintained under the Fort Bridger Treaty of 1868, to give her the right to fish on their waters. Yet St. Clair took this action, perhaps nothing more than a stunt, on the set of a film that has nothing to do with tribal sovereignty. Instead, the film tells the story of settler colonialism with Lewis and Clark as heroes.

 

There are so many moments I wish more people knew about, especially those who know and love these movies. But Picturing Indians maintains a steady analysis of the exploitation of the Diné and their land by the movie industry. Monument Valley is Navajo land, yet it came to embody the West, and the filmic West, through the economic exploitation of the Diné. I document this quite precisely in the book in terms of how they were paid by John Ford and other filmmakers of the era. To better understand the Diné today, I would strongly recommend several films, such as The Return of Navajo Boy, Basketball or Nothing, and Drunktown’s Finest.

 

But more than anything, I want my readers to see that Harry Smith and other Native American actors gave Americans tremendous entertainment value with very little in return. Warner Bros. owns nearly all of the images in the film archives. The studio gave permission for me to reproduce them, then revoked that permission at the last minute as we went to press. My publisher instead pulled the cover image of Harry Smith from the public domain. Smith’s family earns nothing from this and has no rights to the image. Yet the studios possess the rights and refuse to allow anyone to reproduce the vast numbers of images they hold of Native people who worked in film. Harry Smith made the studios a small fortune but died with just about nothing.

The Garbage Troop: Segregation, Primatology, and Republican Rhetoric

Postcard of Charleston High School, 1910. Postcard Collection (UALR.PH.0105), UA Little Rock Center for Arkansas History and Culture.

 

If you watched the Republican National Convention at all, you were probably struck by the expressions of fear that permeated the proceedings—namely, the fear that any failure to re-elect Donald Trump as president of the United States would result in the collapse of the American experiment, if not the dissolution of civilization itself. Words to that effect were spoken many times over; Trump himself, accepting the nomination, said, “This election will decide whether we SAVE the American Dream, or whether we allow a socialist agenda to DEMOLISH our cherished destiny.” (The capitalization is original to the transcript.) But can we take their expressions of concern at face value, or does the Republican Party’s rhetoric conceal another fear entirely?

 

If we roll back several decades, we find that those who opposed school desegregation similarly warned of the collapse of civilization if black and white students were allowed (or “forced”) to study in the same buildings. But reality proved them wrong. The very first school district in the former Confederacy to desegregate following the 1954 Brown v. Board decision was that of Charleston, Arkansas, although it did so rather secretly. This small school district in the western part of the state had been paying to bus black students to Fort Smith for their education, and so the decision to desegregate was as much economic as it was moral. Local leaders did not seek to attract national attention to the fact that eleven African American students were admitted on the first day of classes on August 23, 1954, and desegregation went off with very little opposition.

 

The first reported school desegregation in the former Confederacy occurred in Fayetteville, in the northwestern corner of Arkansas. Seven black students entered Fayetteville High School in September 1954. Despite the district having announced its intentions publicly, the only opposition was one lone woman with a placard. And although black students did report instances of harassment and the use of racial slurs during the school year, a certain camaraderie seems to have formed between black and white students. Many local schools refused to play the integrated Fayetteville football team, and when Coach Harry Vandergriff gave his players the option of benching black players or forfeiting the games, they chose the latter.

 

By the following year, however, segregationists had apparently had enough of the success of school desegregation efforts, drawing a line in the sand at Hoxie, a small town in northeastern Arkansas. As at Charleston, school district officials pursued desegregation to save the cost of busing black students to the city of Jonesboro, but also because such an act was, in the words of Superintendent Kunkel Edward Vance, “right in the sight of God.” And so on July 11, 1955, all school facilities at the local white school were opened up to black children. Everything seemed to be going well for the next few weeks, but later that month, Life magazine published a pictorial essay highlighting the success of desegregation and showing white and black children playing and studying together. Soon, outsiders began flooding into the town, raising the threat of violence. Although they were not successful in rolling back desegregation at Hoxie, they developed the techniques of harassment and intimidation that would come into play two years later in the much-publicized desegregation of Little Rock Central High School, when the federal government was forced to send in troops to restore order after nine black students attempted to enter the school. Finally, segregationists could point to the violence they created and assert, with much more confidence, that letting black and white students study together would disrupt civilization as we know it.

 

Segregationists insisted that the difference between black and white was unalterable and would necessarily produce violent conflict if the proper hierarchy were not maintained. Interestingly, as the battles over school desegregation were raging in America, the study of primatology was coming into its own, giving us a glimpse into the deeper realities of human nature. The first overview of the subject was Irven DeVore’s 1965 Primate Behavior: Field Studies of Monkeys and Apes. In this book, DeVore insisted that aggression in savannah baboons “is an integral part of the monkeys’ personalities, so deeply rooted that it makes them potential aggressors in every situation.” But later studies called this “fact” into question. In the 1980s, Robert M. Sapolsky was studying a particular baboon troop when a neighboring troop began foraging at the garbage pit of a nearby tourist lodge, which provided a wealth of high-energy foods, such as discarded beef and chicken and sweets. Soon, certain members of Sapolsky’s troop began going over to this garbage pit every morning to fight over the new resources. As Sapolsky writes in Behave: The Biology of Humans at Our Best and Worst, these baboons typically “were male, big, and aggressive. And morning is when baboons do much of their socializing—sitting in contact, grooming, playing—so going for garbage meant forgoing the socializing. The males who went each morning were the most aggressive, least affiliative members of the group.”

 

However, some of the meat over which these baboons were fighting came from tubercular cows, and soon TB wiped out not only most of the troop that had found the garbage pit, but also those males from Sapolsky’s troop who were going there. He returned to his troop some years later and discovered that the culture had changed radically. Not only were levels of aggression lower across the board, but “there was minimal displacement of aggression onto innocent bystanders—when number three lost a fight, he’d rarely terrorize number ten or a female.” And the social culture was being transmitted. Adolescent males typically leave their own troop, and those who entered this one were greeted with affiliative overtures by the less-stressed females, such as grooming or sexual solicitation, much earlier than in other troops, and soon assimilated to this new culture themselves.

 

What does all of this talk of school desegregation and primatology have to do with the rhetoric coming out of the RNC? Simply this—that our culture can change in egalitarian ways without threatening our survival. The previously unthinkable can become simply an everyday reality for us, and quickly, too. Black and white children can attend school together without conflict, even in some godforsaken corner of a state not known for its progressive worldview. Those appealing to the power of tradition must create conflict in order to prove their point. Natural hierarchies are anything but; they are not written in our DNA. The study of more “primitive” species illustrates that fact.

 

In other words, Donald Trump and his Republican Party are not afraid that Joe Biden’s election will destroy America. They’re afraid that it won’t. They’re afraid that Joe Biden’s election won’t herald the end of our American experiment in a widening gyre of violence and chaos. They’re afraid that a turn toward egalitarian thinking won’t unravel the survivability of our troop and thus herald our doom. They’re afraid that equality might prove a strength rather than a weakness. And so between now and November, they will create as much chaos as possible in order to prove themselves right. Just as their forebears did at Hoxie sixty-five years ago when they saw black and white children playing together, as happy as they could be.

Twenty-One Days Later: Ventura County's Participation in the Chicano Moratorium of 1970

 

Last month, hundreds of people marked with moxie the 50th anniversary of the August 29, 1970 Chicano Moratorium in East Los Angeles. To protest our nation’s war in Vietnam, racism, and police brutality, nearly 30,000 ethnic Mexicans and their allies from all over the Southwest took to the streets at 9 am that day in a three-mile peace march through the boulevards of Atlantic and Wilshire.

 

Among many slogans, they chanted and held signs expressing, “¡Raza Si! ¡Guerra No!,” “Our Fight Is Not in Vietnam,” “Chicano Power,” and “Stop Chicano Genocide!”

 

In the spirit of the Black Lives Matter movement that has surged since George Floyd’s killing this May by now-former Minneapolis police officer Derek Chauvin, today’s Chicano protests concentrate on law enforcement’s abuse of power.

 

In 1970, Chicanos protested how US casualties in Vietnam disproportionately consisted of young men from their communities in the Southwest. Dr. Ralph Guzmán documented that from 1961 to 1967 their brothers and friends made up 19.4 percent of those killed in action, though this group was only 10 to 12 percent of the national population.

 

Now, they protest the killings of Latina and Latino soldiers, with Army Private First Class Vanessa Guillen, stationed at Fort Hood, being one and Specialist Enrique Roman-Martinez of the 82nd Airborne Division at Fort Bragg another. Roman-Martinez’s sister and mother delivered impassioned speeches at Atlantic Park in East L.A. before the commencement of the 50th-anniversary march this past August 29th. They criticized the Army for its less-than-transparent investigation and decried only having received Enrique Roman-Martinez’s partial remains.

 

In 1970, the Brown Berets of Los Angeles, along with UCLA student Rosalio Muñoz and others, formed the National Chicano Moratorium Committee and organized many demonstrations in Southern California. But the August 29th march and rally at the then-named Laguna Park was the granddaddy of them all.

 

Then tragedy struck. With the pretext of responding to a robbery at a nearby liquor store, Los Angeles sheriff’s deputies and police stormed the peaceful assembly with batons and teargas. The law enforcement-instigated riot resulted in three deaths and hundreds arrested and abused. Ruben Salazar, a former Los Angeles Times reporter turned KMEX-TV news director, considered the voice of the Chicano community, was among the slain; he had stopped at the Silver Dollar Bar on Whittier Blvd., far from the melee, to decompress from law enforcement’s merciless assault.

 

After several contradictory official explanations, it was found that Los Angeles County sheriff’s deputy Thomas Wilson killed Salazar with a 10-inch teargas projectile designed to pierce walls. Many in the community contended then, and believe now, that the powers that be in Los Angeles conspired to assassinate Salazar due to his refusal to temper his reportage of law enforcement misconduct. 

 

In adjacent Ventura County, the Chicano community also viewed Salazar’s homicide as the system’s culling of its leadership. In a September 3, 1970 letter to the Ventura County Star-Free Press titled “Siesta Is Over!,” Arthur Gómez of Santa Paula addressed Governor Ronald Reagan and local elected officials when he stated, “Yes, the siesta is over! The siesta was broken by the murder of two innocent Mexican nationals in a Los Angeles hotel and the 10-inch projectile that shattered Ruben Salazar’s head… One day we shall not have our leaders murdered. One day we shall not have our children made ashamed of being part Mexican. One day we shall have justice and dignity.”

 

Intrepidly, Chicano men and women conducted a peace march in Oxnard on September 19th, twenty-one days after law enforcement’s rampage in East Los Angeles. Approximately 1,000 marchers from all walks of life, different communities, and a span of generations again took to the streets.

 

In their planning, which started weeks, if not months, in advance of the August 29th tragedy, the organizers declared the community’s goals: liberation, an end to Chicano genocide in Vietnam, and an end to police brutality.

 

To avoid an August 29th-like catastrophe, the Brown Berets of Oxnard, the Ventura County chapter of the Mexican American Political Association, and MEChA representatives from local colleges and high schools met in advance with law enforcement.

 

In the week leading up to the “La Raza” (the People’s) peace march, men and women of the Brown Berets leafleted neighborhoods to promote the demonstration and, to further ensure amity at the event, disseminated a code of conduct to the public, the Oxnard Police Department, and the media.

 

On the day of the demonstration, people paraded boldly through the streets of the La Colonia barrio, from La Virgen de Guadalupe Church through the downtown district, with a coffin that symbolized the 8,000 ethnic Mexican servicemen killed in Vietnam. The procession ended at the city’s Community Center. There, as national chairman of the Chicano Moratorium Committee, Muñoz characterized the Vietnam War as the “systematic murder” of Chicanos.

 

The La Raza Moratorium Committee’s communication with law enforcement and the press garnered the community’s goodwill and contributed to the event’s success. Indeed, the Oxnard Press-Courier commended the organizers in an editorial that acknowledged the disproportionate ethnic Mexican casualty rate in the Vietnam War. It also complimented law enforcement in general, in a backhanded manner, for its “diplomacy and restraint.”

 

Fifty years later, Chicanos are proud of being ethnic Mexicans. But with the controversial homicides of Latino soldiers and civilians, such as PFC Guillen and Specialist Roman-Martinez on the one hand and Andres Guardado, shot in the back by an LA County sheriff’s deputy, on the other, we Chicanas and Chicanos still await justice.

The "Noble Dead": Warren Harding and the Resting Places of the WWI Fallen

American cemetery, Aisne-Marne. Photo by author.

 

With controversy swirling around President Trump’s decision in 2018 not to visit Aisne-Marne, a World War I cemetery for American soldiers located some fifty miles outside Paris, one wonders why some American war dead from the Great War were left behind in France, and why some were brought home. A president from the time provides the answer, one who referred to fallen Americans not as “losers” but as “the noble dead.”

Warren Harding, our nation’s twenty-ninth president, not only received the first flag-draped wooden coffins to be returned from Europe after the war, but was also the Chief Executive who dedicated the Tomb of the Unknown Soldier at Arlington National Cemetery.

On May 23, 1921, two and a half years after the end of the fighting in Europe, 5,112 coffins, containing the bodies of soldiers, sailors, marines, and nurses newly returned from France, were carefully set out in a shipyard at Army Pier 4 in Hoboken, New Jersey. The rows of coffins stretched for city blocks. President Harding, who had just taken office in March, arrived via the presidential yacht, the USS Mayflower. While onboard, he composed a short address that reflected the solemnity and the expected shock of seeing so many caskets arrayed in one place.

“There grows upon me,” he said from a bunted platform erected in front of a single, representative coffin, “the realization of the unusual character of this occasion.” Because this simple ceremony had been hastily arranged, President Harding and First Lady Florence Harding appeared in front of what one correspondent described as “a pitiful little handful of soldier relatives while a guard of honor, grim in khaki and trench helmets, stood frozen at attention over their comrades.”

Harding recognized that “our Republic has been at war before, it has asked and received the supreme sacrifices of its sons and daughters, and faith in America has been justified.” But this display was different, unparalleled. “We never before sent so many to battle under the flag in foreign lands,” he said. “Never before was there the spectacle of thousands of dead returned to find their eternal resting place in the beloved homeland.”

The decision to bring remains home from foreign soil was a complicated, extended and negotiated affair. America had no established precedent to consult. When it became clear that there would be a staggering death toll during the Civil War, President Abraham Lincoln signed a law authorizing the creation of national cemeteries (which would include a cemetery at Gettysburg). For years after the war, the remains of Northern soldiers hastily buried near battlefields were exhumed and reburied in venerated cemeteries. And in the handful of small wars where Americans died overseas, sometimes remains were recovered, sometimes not.

 

Makeshift gravesite, France c.1918

 

World War I created a dual challenge. Nearly 75,000 Americans were buried in temporary graves in France, and the cost to recover that many bodies was daunting. Moreover, leaders in France did not relish the idea of endless trains bearing disinterred remains of American dead rumbling through the countryside to ports for shipment back to the United States. France had its hands full with the staggering work to reclaim dangerous and devastated land, not to mention millions of corpses, from a war that had been waged mostly on its soil. So France banned the repatriation of any bodies from January 1919 until January 1922, though it relented on the three-year ban in response to American pressure. Hence, it fell to Warren Harding, elected 100 years ago in November 1920, to meet the first of the returned.

In the United States, many families demanded the return of their loved ones’ remains, worried that they would be forgotten in unmarked or untended graves. The government decided to let families decide whether to seek the return of remains or to leave them where they had fallen, either in existing graves or in nearby official American cemeteries established in France. Ballots were sent to over 80,000 families, who discussed and debated the decision. In the end, about 40,000 bodies were returned and 30,000 were left, buried almost exclusively in American cemeteries.

 

The names of the dead and missing are engraved on a chapel wall near Belleau Wood. Photo by author.

 

Enter Aisne-Marne. This American cemetery is the final resting place for nearly 2,300 Americans. It was built at the base of a hill on which stands Belleau Wood, the site of one of the most monumental battles of the war. This is where the Marines helped stop the German advance toward Paris in the summer of 1918. The Americans arrived just in time, and the cost in human lives was severe. The Marine Corps venerates Belleau Wood as sacred ground, no doubt the reason that John Kelly, then chief of staff to President Trump, made the trip to Aisne-Marne even when the president bailed, allegedly because of weather.

Kelly was a retired 4-star general of the United States Marine Corps. His son Robert, also a Marine, was killed in action in Afghanistan in 2010. John Kelly knew the importance of visiting Aisne-Marne on the one-hundredth anniversary of America’s pivotal engagement in the war; he understood the duty to the families of those buried overseas in American cemeteries to remember and honor “the noble dead.”

Six months after Harding welcomed home the remains of the first 5,000 returned from Europe, he dedicated the Tomb of the Unknown Soldier at Arlington National Cemetery. On November 11, 1921, the third anniversary of the Armistice, Harding said it mattered little whether the unknown was “a native or adopted son.” The sacrifice was the same. “We do not know the eminence of his birth,” he added, “but we do know the glory of his death.”

 

Warren Harding and William Howard Taft observe the Unknown Soldier in state, U.S. Capitol. 

 

President Harding expressed the gratitude of the nation for the ultimate sacrifice of the warriors, what Lincoln called at Gettysburg the “last full measure of devotion.” But he challenged his fellow citizens to do more than to pay tribute to the fallen hero in the unknown tomb. He asked that every American "unite to make the Republic worthy of his death for flag and country.”

Just as Americans visit and revere the graves of those in Arlington and other national cemeteries in the United States, it is important to remember that the nation made a solemn compact with the families of those who were lost in the First World War. The government promised that the sons or daughters of those gold-star families would be buried in American cemeteries, cared for and tended to by Americans, so that no one would forget them or their sacrifice and so that Americans, when overseas, could locate and venerate their honored dead.

Richard Haass on the Need for Historically Informed Policy in a Changing World

 

Richard Haass is the President of the Council on Foreign Relations. He served as senior Middle East advisor to President George H.W. Bush and as Director of the Policy Planning Staff under Secretary of State Colin Powell, and is the author of fifteen books, most recently The World: A Brief Introduction. He discussed the work and the importance of historical understanding with HNN Contributing Editor David O'Connor.

 

David O'Connor: Can you share the story of how a fishing trip sparked your interest in writing this book on history and international relations?

 

Richard Haass: The idea for writing The World: A Brief Introduction was sparked on a summer's day fishing with a friend and his nephew in Nantucket. The young man was about to enter his senior year at Stanford and would graduate with a degree in computer science. As we began talking, it became clear that he had been exposed to little history or politics or economics and would leave the campus with almost no understanding of why the world mattered and how it worked. When I got back to my office, I began looking into this issue and realized that a young American could graduate from nearly any high school or college in the country without taking so much as an introductory course on U.S. history, international relations, international economics, or globalization. To be sure, there are distribution requirements at nearly every college or university, but a student can choose to narrowly focus on one period of history or one region of the world without ever taking a survey course that provides a framework for putting it all together. I decided to write The World to provide that foundation for students or even people who had graduated from college decades ago but need a refresher. A democracy requires that its citizens be informed, and it was evident far too many citizens in the United States and other countries could not be described as globally literate.

Are you an advocate for universities and colleges to mandate a core curriculum?  If so, what courses would you want to see included in it?  

I am a firm believer in a core curriculum. Students (and their parents) should know before choosing to attend a particular institution just what it is they will be sure to learn. Would-be employers should know what a degree from a particular institution stands for. I believe a core curriculum should at a minimum include courses devoted to promoting critical skills (analysis, writing, speaking, teamwork, digital) and knowledge (world history, civics, global literacy). Such a core would still allow every student to have ample opportunity to specialize.

 

How have you and your colleagues at the Council on Foreign Relations encouraged those who are not in college to learn about world history and current international events?  Which efforts do you think have been the most successful?

We continue to publish Foreign Affairs, which releases a print edition six times per year and remains the magazine of record in the field. The magazine contains articles that present fresh takes and new arguments on international issues - the magazine published the famous "X" article by George Kennan that introduced Americans to the concept of containment, for example. Its website, ForeignAffairs.com, publishes shorter pieces every day more closely tied to the news cycle. On CFR.org we publish a host of backgrounders that aim to provide what a person needs to know to get up to speed on issues ranging from global efforts to find a vaccine for COVID-19 and U.S. policy toward the Israeli-Palestinian conflict to the role of the IMF and the U.S. opioid epidemic. We have also produced a series of award-winning InfoGuides on China's maritime disputes, modern slavery, and refugees, among others. We have a series of podcasts, including The President's Inbox, which each week focuses on a foreign policy challenge facing the United States, and another titled Why It Matters, which takes issues and as its title suggests explains to listeners why they should care about them. 

Just as important, a few years ago I created an entirely new education department at the Council. Its mission is explicitly to teach Americans how the world works. Its flagship initiative, World101, explains globalization (including climate change, migration, cyberspace, proliferation, terrorism, global health, trade, and monetary policy), the regions of the world, the ideas basic to understanding how the world operates, and, as of early 2021, history. Each topic includes videos, infographics, interactives, timelines, and written materials. It also includes teaching resources for teachers who want to use the lessons in their classrooms. We have also created Model Diplomacy, which helps students learn about foreign policy and how it is made by providing free National Security Council and UN Security Council simulations.

 

You begin this book with an explanation of the Treaty of Westphalia, one that many people don’t know very well. Why did you start your study in 1648? How have the concepts and practices established in the Westphalian system endured?  

I started with the Treaty of Westphalia because the principles enshrined in those arrangements created the modern international system. The treaty (in actuality a series of treaties) established the principle of sovereignty that increased respect for borders along with the notion that rival powers ought not to interfere in the internal affairs of others. These agreements helped bring about a period of relative stability, ending the bloody Thirty Years War that was waged over questions of which religion could be practiced within a territory's borders. More important for our purposes, they put forward the principle of sovereignty that remains largely unchanged to this day. When you hear the Chinese government declare that foreign powers have no right to criticize what happens inside of China's borders, they are harkening back to Westphalia. At the same time, as I argued in my book A World in Disarray, this conception of sovereignty is inadequate for dealing with global challenges. For issues like climate change, global health, terrorism, and migration, what happens inside a country's borders has huge ramifications for other countries. For instance, Brazil's decision to open up the Amazon for commercial purposes and deplete this natural resource has negative implications for the world's ability to combat climate change. China's failure to control the outbreak of COVID-19 has caused massive suffering around the world. I introduced the concept of sovereign obligation to capture the idea that governments have certain responsibilities to their citizens and the world, and if they do not meet those obligations the world should act. The challenge will be how to preserve the basic Westphalian respect for borders (something violated in 1990 by Iraq in Kuwait and by Russia in Ukraine more recently) and at the same time introduce the notion that with rights come obligations that must also be respected.

 

How did Wilsonian idealism at the Versailles Conference propose to reform the Westphalian model?  Why did the effort fail to prevent another world war a couple decades later?  

Wilson famously declared the United States had entered World War I because "the world must be made safe for democracy." This was a decidedly anti-Westphalian statement, as he was in essence calling for the United States to transform other societies and influence their internal trajectory. The Treaty of Westphalia, as I mentioned above, emphasized that a country's internal nature was its own business, and countries should instead focus on shaping each other's foreign policies. It is too much to say that Wilson's approach failed to prevent another world war. World War II was the result of a convergence of forces, including the Great Depression, protectionism, German and Japanese nationalism, U.S. isolationism, and the weakness of international institutions, above all the League of Nations. What I would highlight about Wilsonianism is that it remains an important strain of American political thought. To this day, there is a school of American foreign policy that emphasizes the promotion of democracy, and, in some cases, the transformation of other societies. My personal preference is to focus our efforts mostly on shaping the foreign policies of other countries.

 

I found your coverage of what you call China’s “century of humiliation” to be one of the most interesting parts of the book. What were some of the key developments that led to this troubled period in China’s history?  How do you think this “humiliation” affects Chinese domestic and international policies today?

As I mention in the book, the "century of humiliation," as the Chinese term it, began with the Opium Wars and closed with the establishment of the People's Republic of China in 1949. It was mostly the result of the internal decay of the Qing Dynasty, which was in large part brought on by its inability to grasp the changes that were going on around it and adjust to the new reality. While Japan, following Commodore Perry's mission, modernized and attempted to catch up with the West in areas where it had fallen behind, the Qing Dynasty remained set in its ways, convinced that the world had nothing to offer China. More important, this "humiliation" shapes the Chinese Communist Party's (CCP) narrative and how it wants Chinese citizens to think about the world. In the CCP's telling, only a strong government can prevent foreign powers from taking advantage of China, while a fractious and weak country invites foreign aggression. Of course, what the CCP then claims is that only it can provide the stability and strength that China needs and uses this take on history to justify one-party rule and the repression of civil liberties.

 

Though you do not deny the hardships and missteps that occurred during the Cold War, you do offer a rather positive evaluation of the stability in the decades-long bipolar contest between the US and Soviet Union.  What were some of the features of the Cold War that helped manage the tensions between the superpowers and prevent the outbreak of a hot war?  Can some of these be applied to the current Sino-American relations?

 

We should not discount the role that nuclear weapons played in keeping the Cold War cold. Simply put, the specter of nuclear war kept the competition between the United States and the Soviet Union bounded, as any potential war between the two powers could have led to a nuclear exchange that would have decimated both countries and the world. Many international relations scholars argue that a bipolar world is inherently more stable than a multipolar one, because it is easier to maintain a balance of power and stability more broadly when there are only two centers of decision-making. I would add that the United States focused mostly (although not exclusively) on the Soviet Union's international behavior and did not seek to overthrow the regime. There was a measure of restraint on both sides. Finally, there were frequent high-level summits, arms control agreements, and regular diplomatic interactions. These all helped establish understandings and communicate what each side would not accept.

In terms of Sino-U.S. relations, I believe nuclear deterrence will work to lower the prospect of war between the two countries. I am concerned, though, that we do not have a real strategic dialogue with China. We need to be able to sit in a room with each other and at an authoritative level communicate what we will not tolerate in areas like the South China Sea, the East China Sea, and the Taiwan Strait. The chances of miscalculation are too high. I also believe we should focus less on China's internal trajectory and more on shaping its foreign policy. We cannot determine China's future, which will be for the Chinese people to decide. We should continue to call out the government's human rights abuses in Xinjiang and its dismantling of Hong Kong's freedoms, but we should not make this the principal focus of our relationship. Instead, we should compete with China, push back against its policies that harm U.S. interests, and seek cooperation where possible with China to address global challenges. 

 

In the Cold War era, both Europe and parts of Asia experienced tremendous economic growth, peace, and prosperity. What role did the United States play in facilitating these positive outcomes?  Are there lessons from Europe and East Asia that can be applied to other parts of the world today?

First of all, we should give credit to the people of Europe and Asia for their tremendous economic success. In terms of the U.S. role, there was of course the Marshall Plan in Europe that provided the funding Europe needed to get back on its feet and rebuild after World War II. In Asia, the United States gave significant aid to its allies. The point I would make is that this aid was not done purely out of altruism. Instead, it furthered U.S. interests. It ensured Western Europe did not go over to the Soviet Union and that U.S. allies in Asia could be stronger. Foreign aid continues to be an important tool in our foreign policy toolbox, and we should continue to use it to further our interests. For instance, with China extending its reach around the globe through the Belt and Road Initiative, the United States should respond with a better alternative that would provide funding for infrastructure in the developing world but make it conditional on the infrastructure being green and on the countries undertaking necessary reforms. Trade can also be a powerful tool for promoting development.

 

What are some of the key developments that undermined the great hope that followed the end of the Cold War?  

In many ways, the Cold War was a simpler time for U.S. foreign policy. The country had one adversary, and it could devote most of its resources and the bulk of its foreign policy apparatus to addressing it. After the Soviet Union collapsed, containment lost its relevance, and U.S. foreign policy lost its compass. The United States enjoyed unparalleled power, but no consensus emerged as to how it should use that power: should it spread democracy and free market economics, prevent other great powers from emerging, alleviate humanitarian concerns, tackle global challenges, or something else? I've begun calling the post-Cold War period of U.S. foreign policy "the great squandering" given that U.S. primacy was not converted into lasting arrangements consistent with U.S. interests.

I would point to a few U.S. missteps that set back its foreign policy agenda and undermined the hope you refer to. First there was the mistaken 2003 invasion of Iraq, where the United States initiated a war of choice in the hope of transforming the country and the region. The Iraq War, and the nation-building effort in Afghanistan, soured many Americans on their country playing an active role internationally. Simply put, they believed the costs of such a role outweighed the benefits. Now, as the United States faces challenges from China to Russia, Iran, and North Korea, Americans are weary of getting involved. Relations with Russia soured, some would argue at least in part because of NATO enlargement.  The 2008 global financial crisis raised doubts worldwide about U.S. competence, as has the American response to COVID-19. In short, the relative position and standing of the United States have deteriorated.

 

After World War II, the United States helped construct what you call the liberal world order.  What are the key features of this order?  What do you consider its greatest strengths and weaknesses?  

The liberal world order is an umbrella term for the set of institutions the United States helped to create in the wake of the Second World War, including the United Nations, the World Bank, the International Monetary Fund, and the General Agreement on Tariffs and Trade (now the World Trade Organization). It was rooted in liberal ideas of free trade, democracy, and the peaceful settlement of disputes, and was also liberal in the sense that any country could join the order as long as it abided by its principles. It was never truly a global order during the Cold War, as the Soviet Union and its satellite countries opted out of many of its elements. 

The great strengths of the liberal world order are that it has promoted unprecedented peace, prosperity, and freedom. But increasingly it is being challenged. Its liberalness is rejected by authoritarian regimes. Many governments or non-state actors are not prepared to hold off using force to advance their international aims. In addition, the order has had difficulty adjusting to shifting power balances (above all China’s rise) and in developing collective responses to global challenges such as climate change, proliferation, and the emergence of cyberspace.

 

China’s emergence as a world economic power has greatly challenged this liberal world order and efforts to get it to conform to some of its basic principles have come up short.  How can other countries persuade and/or pressure China to adhere to the practices and rules of institutions (e.g., the World Trade Organization) dedicated to upholding the order?

First, it is fair to say that some institutions, such as the WTO, were not set up to address a country such as China, with a hybrid economy that mixes free market enterprise with a large state role. And the WTO failed to adjust sufficiently to China’s rise. The United States should be working with its allies and principal trading partners to bring about a major reform of the WTO. More broadly, the single greatest asset that the United States enjoys is its network of alliances. China does not have allies, whereas the United States enjoys alliances with many of the most prosperous and powerful countries in Europe and Asia. The United States needs to leverage those alliances to present a united front in pushing back where China does not live up to its obligations. It should also work with its allies to develop an alternative 5G network, for example, and negotiate new trade deals that set high standards and would compel China to join or risk being left behind. In the security realm, it should coordinate with its allies in Asia to resist Chinese claims to the South China Sea and make clear to China that any use of force against Taiwan would be met with a response.

 

Despite the fact that the US was a driving force behind establishing and maintaining this liberal world order, many Americans have grown weary of the costs involved and fail to see how it benefits them.  Indeed, this was a key feature in President Trump’s 2016 campaign message and continues to influence his foreign policy.  How can policymakers who want to continue American leadership in this order persuade Americans that the system actually benefits them?  

Policymakers need to be more explicit in highlighting the benefits of the liberal order and contextualizing its costs. We avoided great power war with the Soviet Union and the Cold War ended on terms more favorable to the United States than even the most optimistic person could have imagined. Global trade has skyrocketed, and America remains the richest country on earth. Alliances have helped keep the peace in Europe and Asia for decades. In terms of the costs, defense spending as a percentage of GDP is currently well below the Cold War average, which was still a time Americans did not have to make a tradeoff between butter and guns. We can assume a leadership role abroad without sacrificing our prosperity. On the contrary, playing an active role internationally is necessary in order to keep America safe and its people prosperous. The United States may be bordered by two oceans, but these oceans are not moats. Even if we choose to ignore the world, it will not ignore us. Both 9/11 and the COVID-19 pandemic have made this abundantly clear. 

Making History with Music

PFC Richard Burt, March Field, Riverside, CA 

 

Seventy-five years ago World War II ended, but the stories of the men and women who served continue to be woven into the history of the United States. When the war ended, Corporal Richard Burt attended the Juilliard School of Music with his constant wartime companion, his trumpet. During the war, the 19-year-old private served in the 746th Far East Air Force Band in the Philippine Campaign and shared musical experiences with front-line troops, generals, foreign dignitaries, and some of the most famous service members taken prisoner by the Japanese when U.S. forces in the Philippines surrendered in 1942.

His journey began stateside, in the band at March Field in Riverside, California. It was there that he learned under the best in show business: “I learned an awful lot about blowing my horn there. Three-fourths of March Field had been professionals in Hollywood recording industries or were members of nationally known big swing bands.” Richard was challenged musically by military formations at March Field as well, when he was asked to play taps for the first time, as a 19-year-old PFC, at the funeral service of legendary World War II pilot Lt. Col. William “Ed” Dyess. Dyess was well known for his bravery at Bataan and as one of a group of twelve POWs who made the only successful mass escape from a Japanese POW camp during the entire war in the Pacific. Dyess’ final act of bravery earned him the Soldier’s Medal when his plane malfunctioned over Glendale, California, and he chose to crash land in a vacant lot rather than ditch his plane and possibly kill or injure civilians. “I’ll always remember the first time I played taps in the war for William Dyess. He was a real hero and that experience always stuck with me.”

 

Lt. Col. William "Ed" Dyess, 1943, after his return from his POW experience

As the war in the Pacific raged in 1944, a call came to March Field to furnish a trumpet player with a sergeant’s rating for overseas assignment in a newly formed band.  Being the youngest in the group and only a PFC, Richard volunteered, “All of our Sergeants were married, so, I marched into our Chief Warrant Officer’s office and asked if I could take the place of that married man that was slated to go.”  His request was granted and he was off to an unknown destination with his new companions, the 746th Far East Air Force Band.  

Upon arrival at Leyte Island in the Philippines, the newly formed band played their first show with a USO group for front-line soldiers. “We played that show three nights in a row, the last being up at the front. It was an area where all the palm trees had been blown in half. A makeshift stage had been set up and when we arrived there were GIs climbing up these half blown up palm trees to attach spotlights for the show. All the men who came to see the show came in their ponchos with their helmets on and their rifles sticking out. As the show progressed, across the ravine, there would be sounds of automatic weaponry and you could see the flashes every once in a while that the shooting made. So, even while we performed, on the other side of the ravine, there was action going on. That was as close as I ever came to fighting in that war.” Stories such as this exemplify the experience of the front-line band and the role it played in World War II, using music to give a respite to soldiers, sailors, marines, and airmen and make them feel a little more connected to home, even on the other side of the world in a combat zone.

 

PFC Burt practices his trumpet in the jungle of Leyte

 

As the Philippine Campaign progressed, Richard and the 746th Far East Air Force Band would move into the City of Manila on Luzon, eventually being stationed at the headquarters of the Far East Air Force at Fort McKinley, under the command of General George Kenney. It was here that the band would cross paths with another legendary group from the Philippines: the Army and Navy Nurse Corps “Angels of Bataan,” who had been freed from their POW camp at Santo Tomas in February of 1945. According to Richard, “We played for one formation only, that is a military formation, and that was after the war had ended with Japan. This was when nurses who had been taken prisoner when Corregidor fell were given medals. We stayed in the shade to play that formation and for good reason, it was suffocatingly hot.”

 

Army Nurses, popularly known as the "Angels of Bataan," awarded the Bronze Star. Leyte Island, 1945

 

As the war came to an end and the unit was about to break up, it was decided that it would be fun to record the group. It was done in the band’s rehearsal tent, using a wire recorder. In what was described as a fatiguing session, the band recorded themselves on wire playing ten chart-topping big band hits, musically arranged by members of the 746th. Upon completion of the recording session, lead trumpet player PFC Richard Burt asked his commanding officer if he could have the recordings to take home. Chief Warrant Officer John Washburn granted Richard’s request, and Richard brought the recordings back to his home in Salt Lake City, where he took them to KSL Radio and had them transferred to records.

Richard’s life after the war always included music and his longtime wartime companion, the trumpet.  He graduated from the Juilliard School of Music in 1953 and received his BA and MA in music education from Drake University in Des Moines, Iowa.  He passed his passion for music on to his family and to the students he taught in the public school systems of San Francisco and West Sacramento.  He kept the recordings of his World War II band safe for decades, but lost track of them in his home in the 1980s.  Richard passed away in August 2016 at the age of 92.

When his wife Marilyn passed away in October of 2019, I found my grandfather’s misplaced recordings in his attic. As a historian, I felt bound to preserve this one-of-a-kind artifact and honor my grandfather, his band mates, and all those who served during the war.  Working with a four-time Grammy Award-winning sound engineer, I am producing the band’s work as a modern album, with a 28-minute narration on the band and the military experience of Richard Burt, recorded by Richard himself in the 1980s.  The project has multiple goals, but the boldest is historic: to take an album created by a front-line Army band in the Pacific 75 years ago and make a band of World War II veterans a platinum-selling artist by selling a million albums.  The album created 75 years ago by the 746th Far East Air Force Band will be available for sale on Veterans Day 2020. A portion of the proceeds from album purchases will be donated to the United Service Organizations (USO) and the National World War II Museum in New Orleans.  If you wish to follow this World War II music project as it unfolds, you can follow the 746th Far East Air Force Band on Facebook at https://www.facebook.com/746thFEAFband/ or on Twitter @746thFEAFband.

 

A Conversation with Seattle Author Dr. Lawrence Matsuda on His Debut Historical Novel "My Name is Not Viola"

Lawrence Matsuda portrait by Alfredo Arreguin

On December 7, 1941, forces of the Japanese Empire attacked the American naval base at Pearl Harbor and left thousands of American military members and civilians dead or wounded. In response to the surprise attack, the United States declared war on Japan the next day. The attack inflamed anti-Japanese sentiment and hysteria that led to hate crimes, particularly on the West Coast, against aliens and US citizens of Japanese extraction—and those who looked like them.

Under President Franklin D. Roosevelt’s February 1942 Executive Order 9066, the US government forcibly removed 120,000 people of Japanese ancestry from their homes and incarcerated them in concentration camps.  Most of the internees were kept in the camps until 1945, with the exception of a few released early, such as the valiant souls who volunteered to serve in the American armed forces, including members of the Japanese American 442nd Regimental Combat Team, which became the most decorated American unit of the war. Others were released to attend college or to work in defense industries, such as munitions factories, in areas away from the West Coast.

The internees subjected to the harsh and dehumanizing conditions of the prison camps had committed no crime but were rounded up, dispossessed, and detained unconstitutionally based only on their ancestry and race. About two-thirds of them were United States citizens.

The detainees included Hanae and Ernest Matsuda, who lost their home and grocery business in Seattle when they were removed in 1942. Like thousands of others, they were evacuated without due process and incarcerated at the Minidoka concentration camp in Idaho, where Hanae gave birth to two sons and a stillborn child.

Hanae and Ernest Matsuda’s youngest son Lawrence was born in 1945 in Block 26, Barrack 2, of Minidoka Camp. Their baby’s prisoner number was 11464d. 

Now Dr. Lawrence Matsuda, a renowned Seattle writer and human rights activist, brings to life his mother’s travails, traumas, and triumphs in mid-20th century America in his debut historical novel My Name is Not Viola. The events experienced by the fictional Hanae of the novel mirror actual incidents in the life of his mother including her girlhood in Seattle’s Japantown; her pre-war journey to Hiroshima, Japan; her removal from her Seattle home and incarceration at the brutal Minidoka concentration camp; her quest for Hiroshima relatives after the atomic obliteration of the city; her marital woes; her severe depression and incarceration at Western State Hospital, a psychiatric facility; her resilience grounded in Japanese and western beliefs; and her evolution as a force for good.

The novel captures the rhythm of life in Seattle’s Japantown, the unrelenting misery of internment at the Minidoka camp, and the pain and loss of internees as they returned home after the war to face dispossession and poverty. This history through the eyes of the fictional Hanae grips the reader with its lively writing and evocative imagery while sharing an important and heartbreaking chapter from our American experience. Yet it is also a story of hope and triumph despite recurrent traumas—and quite timely as we face an unprecedented pandemic and political crises today.

Dr. Matsuda is known in Seattle as a voice for social justice, equality, and tolerance. He is a former secondary school teacher, administrator, principal, and professor. He received an MA and PhD from the University of Washington.

As a writer, Dr. Matsuda is best known for his poetry. His first book of poems, A Cold Wind from Idaho, was published by Black Lawrence Press in 2010. He has published two other books of poetry, one in collaboration with renowned American poet Tess Gallagher, as well as a graphic novel about the Second World War experiences of the Japanese American 442nd Regimental Combat Team. Chapters one and two of that graphic novel were animated by the Seattle Channel, and both won regional Emmys, one in 2015 and the other in 2016. His poems have appeared in publications including Raven Chronicles, New Orleans Review, Floating Bridge Review, the Poets Against the War website, Nostalgia Magazine, Plume Poetry, Surviving Minidoka (book), Meet Me at Higos (book), Minidoka-An American Concentration Camp (book and photographs), and the Seattle Journal for Social Justice, among many others. And he co-edited the book Community and Difference: Teaching, Pluralism and Social Justice, winner of the 2006 National Association for Multicultural Education Phillip Chinn Book Award.

And Dr. Matsuda continues to work tirelessly for a more just and tolerant nation.

He graciously talked about his new novel and his writing career by telephone from his home in Seattle.

 

Robin Lindley: You had a successful career as an educator, administrator, and professor. How did your “encore career” as a poet and writer come about? 

Dr. Lawrence Matsuda: When I got my PhD, I decided to take something fun because the PhD was tough sledding and not always enjoyable. So, I took a poetry class from Nelson Bentley. 

Robin Lindley: He was a beloved professor at the University of Washington.

Dr. Lawrence Matsuda: Yes. I enjoyed it a lot. I attended his class several times and read for the Castilla reading series for several years. He always encouraged me to publish my poetry. He was a good person and took great pride in having his students published. 

I moved my energy into poetry after my PhD, and continued to write poetry when I was working. Most of it was not great, just mediocre poetry.

In about 2008, I decided to get good at poetry. I worked with Tess Gallagher. She helped me with my first book of poetry A Cold Wind from Idaho. I thought I was done because I had worked with some other people who helped. I gave the manuscript to my friend, the artist Alfredo Arreguin, and he said Tess Gallagher was coming to his house, and that he would show the book to her. Evidently, she was taken by the manuscript, but decided it needed revisions. She worked with me for about a year, mostly electronically. We finally met and I submitted to Black Lawrence Press as part of a contest. It didn't win first prize, but received honorable mention, and it was published in 2010. Currently more than 1,300 copies are in print.

Robin Lindley: Thanks for sharing that story. It’s wonderful that one of our great American poets, Tess Gallagher, helped launch your writing career. Now you've written this historical novel, My Name is Not Viola, based on the life of your mother. What sparked a novel at this time? Did you see it as a memoir for you as well as the story of your mother? 

Dr. Lawrence Matsuda: It started as a play set in the Minidoka [concentration camp] canteen, where old guys were sitting around talking in a general store--a cracker barrel scene.

I decided that the play wasn't going anywhere. It was just talking, and it needed a little more action. So, I looked to my own life and I compared it to my mother's and my mother had a much better story. 

It's not a memoir because some of it is fiction, and it’s not an autobiography. It follows the same character in the first person from beginning to end. It’s a historical novel that looks very much like a memoir.

The bones of the novel are my mother’s story, and that structure is true. My mother was born in the United States. She went to Japan and was educated there. She came back to the United States, and then got married. She was incarcerated. And she went to a mental hospital. So, all the bones are true, and to add flesh, I borrowed some of the stories that she told me. I filled in the blanks and then, to move the story farther, I added stories that I heard from other people about Minidoka.

I’ve made pilgrimages to Minidoka six or seven times. They have a story time when former internees talk about being there. I borrowed some of those stories, and then farther out, I brought in stories of my friends, and then way out farther it was just fiction. So, the book is historic fiction based on the general outline of my mother's life. 

What motivated me is, I have always thought that each person has a good story, and at least one novel. I decided I needed to write and find my one novel, but it wasn't my story. It was my mother's story. 

The other thing is that I’ve always felt an artist should keep moving. I went from poetry to a graphic novel, to a kind of poetry exchange with Tess, and then to a novel. I'm always trying to do different things. I think an artist should always try something new. Because the incarceration is so powerful, it is very tempting to dwell on it and not move forward.  For the novel, I wanted to present the context of the incarceration and the aftermath to give a larger perspective.

Robin Lindley: Thanks for your words on your process. How did you decide on the novel’s title, My Name is Not Viola?

Dr. Lawrence Matsuda: I found my mother’s high school annual and there were inscriptions like “Good Luck, Viola.” I asked her who Viola was, and she said her teacher gave her that name. 

Robin Lindley:  In your novel, you take your mother’s life and add to the story. Picasso said that art is the lie that tells the truth. You share an engaging human story that deals on so many levels with the forces of history such as racism and injustice and the aftermath of war. It’s incredible how much she dealt with in her life.

Dr. Lawrence Matsuda: There are 120,000 stories of people who were 

forcibly incarcerated and each one is different but similar. They all experienced the same thing at different levels. My story is only one of 120,000. 

Robin Lindley: You were born at Minidoka in 1945, so you must not have any direct memory of the internment.

Dr. Lawrence Matsuda: No, but I do have borrowed memories. No matter what, at every Christmas, every Thanksgiving, every New Year's party, every wedding, funeral, the evacuation and the incarceration always came up. It's just a part of life. And I have these borrowed memories that usually focus on the worst of the experience. 

I don't have clear memories in the traditional sense, but my friend, a psychiatrist, says that when my mother was pregnant, more than likely some chemicals were passed to me in her womb, and that affected me in terms of fear and stress that shaped my personality. He also has said that when he talks to someone who has deep problems, he often asks whether their grandparents suffered any problems. He says big traumas are passed down for three generations. He feels that what happened to your grandparents and your parents is relevant to your current situation.

Robin Lindley: I’ve heard about studies on genetics and past trauma. There are several studies with grandchildren and children of Holocaust survivors. 

Dr. Lawrence Matsuda:  So the trauma is passed down, and somehow you adjust. The third generation of trauma can still affect you.

Robin Lindley: So, we’re haunted by the traumas of earlier generations. You deal with almost a century of modern American history in the book. What was your research process as you wrote the novel?

Dr. Lawrence Matsuda: I went to Minidoka about six or seven times. In 1969, I taught the first Oriental American history class in the state of Washington at Sharpless Junior High School—now Aki Kurose Middle School. So I was interested in history and, while there, a number of things happened. I met Mineo Katagiri, a reverend who founded the Asian Coalition for Equality, and we worked together. 

Later on, some members of the Asian Coalition for Equality and I confronted the University of Washington because they were not admitting Asian students into their educational opportunity program (EOP). At the time, it was called the Special Opportunity Program, which served poor whites, blacks, Latinos, and Native Americans, but not Asians. 

And so, my interest in history took a step into activism. Ironically, it did again with the kids in the Oriental American history class. At that time, we were still referred to as “Orientals” and the term “Asian” was emerging. The class made a display of miniature barracks like those at Minidoka for an exhibit called “The Pride and the Shame,” a Japanese American Citizen League’s traveling exhibit for the University of Washington Museum. 

Bob Shimabukuro, in his book Born in Seattle, writes about how the traveling exhibit was the impetus for the reparations movement for Japanese Americans. So, my history interest moved me into activism, and my activism was rooted in history, especially anti-Asian, anti-Chinese, and anti-Japanese prejudice, which culminated in the forced incarceration.

Robin Lindley: Thank you for your work for change. To go back to your novel, I’m curious about the story of your main character Hanae, who is based on your mother, and your mother's actual experiences. Did your mother go to Hiroshima, as in the novel, when she was about nine and have a rather dismal experience with her relatives, especially her older brother’s wife?  

Dr. Lawrence Matsuda: That was not true. She was born in Seattle and she went to Japan at age one and she returned with her mother and brothers about eight years later.  Her father stayed in Seattle and sent money home to Hiroshima when the family was there. And when she was nine years old, she came back to Seattle. When she was 21, she returned to Hiroshima to live with her older brother and that's when she couldn't get along with her sister-in-law and left after a year. 

Robin Lindley: And did she have an older brother Shintaro who was an officer in the Japanese Navy? 

Dr. Lawrence Matsuda: Yes. He was a submarine officer. He was not a captain, but he was a high-ranking officer on a submarine. He mentioned that the warlords were feeling very confident because of the victory over a Western power in the Russo-Japanese War.

Robin Lindley: The militarists were building sentiment for war in Japan in the early 1930s. In your novel, you depict the removal, the evacuation, and the internment vividly. Was your depiction of Hanae’s story in the novel similar to what your mother experienced in the shocking removal and then the incarceration?

Dr. Lawrence Matsuda: Yes, it was as described.

I think most of the Japanese were shocked. They knew that Japanese nationals were at risk as non-citizen aliens. There was a law that wouldn't allow them to become naturalized citizens, so they were aliens. That would be her father's generation. But the initial thought among the Japanese was that they would not take the Nisei [second generation], who were US citizens. So, they were shocked when citizens were taken because it was totally unconstitutional and un-American. You don't round up and arrest citizens for no crime without due process, right?

Robin Lindley: Didn’t the US government contend that the order of evacuation and internment was to protect people of Japanese origin because of extreme anti-Japanese sentiment after the Pearl Harbor attack?

Dr. Lawrence Matsuda: Some people used that excuse, but that wasn't the reason they were evacuated. If you read the actual evacuation notice, it says all persons of Japanese ancestry, alien and non-alien, were to report to designated locations. And overnight the Nisei, who were citizens, became non-aliens.

Robin Lindley: And weren't the families and others of Japanese ancestry actually rounded up by troops armed with rifles with fixed bayonets? 

Dr. Lawrence Matsuda: Yes. There were troops. The people were told to report to certain places.  The earliest pickups were done by the FBI, which took mostly first-generation people who were leaders of the community shortly after Pearl Harbor, while the bulk of the Japanese were taken in April.

Robin Lindley: It was a heartbreaking violation of human rights and the rule of law. What happened once these citizens and non-citizens were rounded up? What happened to their property and possessions? 

Dr. Lawrence Matsuda: It was different in every region of the country, but here the Japanese obviously sold off a lot of their goods at fire sale prices. And they stored some items. My parents actually stored some goods at a storage company and also at the Buddhist church. 

There were people in rural areas who left their land to others to care for. For example, on Bainbridge Island, some leased their land to their Filipino workers. They did take care of it, and when the Japanese returned, the land was in good shape. And some of the Japanese split the land with the Filipino workers. Other Japanese left their land, and it was totally in disrepair when they came back. Many couldn’t keep their properties because they couldn't pay the taxes. So the land was lost.

There are countless stories. One storeowner left his ten-cent store in the care of a Jewish man, I think a jeweler, who watched the boarded-up store and took care of it. Nothing happened to that store, but other places, such as farmhouses, were destroyed. A farmhouse was burned on Vashon Island. Farmhouses were vandalized in anti-Japanese incidents in Hood River, where the whole town signed a petition not to permit the Japanese to return--but the Japanese did anyway.

Each place has a different story, but overall, most of the people lost their businesses. Most of them lost their jobs. Most of them lost their homes. Most of them sold whatever they had at a huge discount. So it was a very difficult time. Goods were sold for pennies on the dollar, and customers took advantage because they knew that the Japanese were vulnerable.

Robin Lindley: You have some remarkable scenes in your novel. I was struck when some white person wanted to buy a piano for a dollar. 

Dr. Lawrence Matsuda: Yes. The Japanese knew they couldn't take it with them. And, if a store was going out of business, they would sell at a huge discount on all goods. They were trying to make something, no matter how small.

Robin Lindley: Were there physical attacks on people of Japanese origin following the Pearl Harbor attack?

Dr. Lawrence Matsuda: I hadn’t heard of any physical attacks. I know some Filipinos were beaten up because they were thought to be Japanese. The Chinese wore buttons saying “I am Chinese.” And I know that there was a man who was impersonating an FBI agent and he tried to do some bad things to Japanese women. 

Robin Lindley: That was such a time of fear and hysteria. What are some things you’d like people to know about the conditions of the concentration camp at Minidoka where your parents were held and where you and your brother were born? You describe the circumstances vividly in your novel. 

Dr. Lawrence Matsuda: They were in the desert. The food was not always sanitary. The quarters were cramped. There was no privacy. People had to use the latrines instead of regular toilets. There were scorpions and rattlesnakes and dust storms. 

All of that was just a given, but the worst part of it was being betrayed by your country. I compare it to rape. The whole community was raped and we handled it like rape victims. Some were in denial and others tried to prove that they were good citizens. Some committed suicide. Others were just depressed. So, the worst part of it was the mental realization that the whole community was raped. And very few on the outside really cared. I compare it to a rape by your uncle--by someone you trust in your family. It was a rape by our Uncle Sam.

Robin Lindley: And wasn’t the internment out of sight and out of mind, without much press coverage or any outside attention? 

Dr. Lawrence Matsuda: Yes. Minidoka was tucked into a ravine, and 9,000 people were imprisoned there. If you drove by, you wouldn't even see Minidoka, even though it was the third largest city in Idaho at the time.

The physical conditions were bad, but I think the mental trauma was really devastating. The fact that your country betrayed you. And afterwards, think about it: who can you trust if you can't trust your government to protect you and maintain your rights? Who can you trust?

Robin Lindley: That history is devastating. What sort of housing did your mom and dad live in there at the concentration camp? I understand the shelters were very crude and crowded with little privacy.

Dr. Lawrence Matsuda: They lived in barracks that were hastily constructed. They had tar paper on the outside and weren't shingled or sided. It was like army barracks. It was open, they used blankets as curtains, and several families shared each building. The noises and the smells spread. The barracks were heated by a potbelly stove that burned coal.

At the first relocation center, my parents were given ticking and sent to a pile of straw to stuff a mattress. That's what they slept on at Camp Harmony in Puyallup, which was actually a county fairground. Some of the bachelors lived in the horse stalls that still had horse smells. My cousin got the measles and was quarantined in a horse stall. 

When they moved to the permanent camps, like Minidoka and the other camps, they lived in hastily-constructed, army-style barracks with cracks in the floors, cracks in the walls. The wind would blow through. And the barracks all looked alike so people could get lost and wander into your area at night. 

Robin Lindley: And there were extreme temperatures in the hot summers and cold winters. The weather must have been miserable. 

Dr. Lawrence Matsuda: It was cold and muddy in winter. The residents had to walk on boards that were laid down on the mud. That was how they got to the mess hall. My mother would never eat Vienna sausage because it had caused dysentery several times.

Robin Lindley: And wasn’t healthcare limited? 

Dr. Lawrence Matsuda: There was a hospital for patients on site. When there was an outbreak of dysentery, you had to line up at the latrine with everyone else, because everyone who ate at the same mess hall had dysentery. One night, the lines were so long and the internees so upset that the guards thought there was a riot. Soldiers were going to shoot. The residents shouted, “No, no, it's dysentery. We've got the trots.” And so, the soldiers left them alone.

Robin Lindley: When your parents were released from Minidoka with you and your brother, they returned to Seattle where they had been dispossessed. And your mother was facing the additional trauma of dealing with the probable deaths of her relatives in the atomic bombing of Hiroshima. 

Dr. Lawrence Matsuda: They actually released many people at Minidoka before the end of the war to work, attend college, or join the army. My father left several times to find housing, which he never found.  So, they stayed in camp until it closed. The administration shut it down, turned off the electricity, told them to leave, and gave them a train ticket and $25.

Back in Seattle, my family stayed in the basement of my mother's friend's house for a while. We lived there until my dad could find proper housing, but it was in short supply because of the war and the GIs coming back. 

It was not an easy time. And, there was racial real estate redlining in Seattle, so we couldn't move to the best part of town. We could only move to certain parts of town. If those areas were taken, it was tough luck. And in fact, some of the Japanese who moved out of the Central Area returned and found that African-Americans who came up from the South to work during the war had moved into the redlined area.  

Robin Lindley: That’s another tale of discrimination in America, and we're still living with the results of racist redlining. Thanks for sharing that insight. I didn't realize the effect on the Japanese community. Your mother must have been shaken by the terrible atomic bombing of Hiroshima and the lack of news about her relatives.

Dr. Lawrence Matsuda: Yes. The first news they heard was that Hiroshima was bombed. Tokyo had suffered firebombing with more or less conventional bombs like napalm, but the residents did not understand what an atomic bomb was or what its effects were.

Recently, I read an article about how the US was suppressing news about the Hiroshima destruction until John Hersey visited Hiroshima and wrote his famous book, which revealed the aftermath. 

The news came in very slowly. It wasn’t like today when, if something happens, CNN is there by the next day. This news dribbled in. They knew that Hiroshima was destroyed, but they didn't know quite what that meant. It was the instantaneous destruction that was hard to comprehend. You could understand something being destroyed slowly, but everything in Hiroshima was vaporized or destroyed in an instant. 

My mother didn't know what happened to our relatives. It was only because of our relatives in the countryside that she found out the full story. But it was tough for her because she had lived in Hiroshima and she knew the city, so it was really devastating to realize that the city and many of her relatives were gone instantly. 

The people of Hiroshima were not soldiers. Soldiers expect to be put in harm's way and die, but these were civilians: old women, old men, young children, and workers.  They were vaporized and destroyed instantly, or died later of radiation sickness.

Robin Lindley: Have you traveled to Japan and visited Hiroshima? 

Dr. Lawrence Matsuda: Yes. I was actually in Hiroshima during the 50th anniversary of the bomb.  It is a strange city. Kyoto is very old. You see the shrines and the old architecture. Hiroshima is modern. It doesn't look like a Japanese city, but a modern city because it was totally destroyed. And in real life, our family home was only a thousand meters from ground zero. 

Robin Lindley: That visit must have been very moving for you then. Now it’s the 75th anniversary. 

Dr. Lawrence Matsuda: Yes. But I was surprised too when I met my relatives, the children and grandchildren of my mother's oldest brother. They were all very positive, very healthy, and very energetic. They were generally happy people. I met Akiko, who survived the bomb. She was in the family home at the time.  I met her son, and her son’s son. So it seems life goes on.

Robin Lindley: Yes, that’s encouraging. Didn’t Akiko suffer radiation illness and severe burns?

Dr. Lawrence Matsuda: Yes. She’s mentioned in the book. 

Robin Lindley: Your description of Hanae’s treatment for depression at Western State Hospital, a psychiatric facility, is very moving. It happened in 1962 and you juxtapose her experience with the Cuban Missile Crisis. You also destigmatize mental illness. Does your portrayal in the novel parallel your mother’s own “incarceration” at the hospital when she was admitted for severe depression? 

Dr. Lawrence Matsuda: I really couldn't say for sure because she never talked about it. But I did talk to my friend who is a psychiatrist.  He took me to the Western State Hospital Museum and I saw what it was like, and I knew what they did at the time. I studied the hospital’s history and learned that doctors specialized in lobotomies at the time.

Robin Lindley: Did you visit your mother when she was in the hospital? You must have been a teenager then. 

Dr. Lawrence Matsuda: I visited her once. They wouldn't let me go inside. We had to meet her in front of the hospital, in the parking area, at the turnaround. She came out to see us.

Robin Lindley: What do you remember about that visit?

Dr. Lawrence Matsuda: She was very thin and she looked worse than when she entered. 

Robin Lindley: And what kind of treatment did she receive? Did she actually have shock treatment or electroconvulsive therapy? 

Dr. Lawrence Matsuda: I'm sure she did. My psychiatrist friend told me that was pretty standard. 

Robin Lindley: Did your mother seem depressed to you before she was hospitalized? Did she talk about suicide? 

Dr. Lawrence Matsuda: Yes, she seemed depressed, and she was very distant and not engaged. But she did admit to her sister-in-law that she was contemplating suicide. 

Robin Lindley: Wasn’t there almost an epidemic of suicide among the internees after the war?

Dr. Lawrence Matsuda: Yes. There’s no real data on that because nobody kept track of it. But I talked to Tets Kashima, who was a professor of Asian American studies, and he said in California suicide was prevalent. There were just a lot of suicides. And the other thing was, few people talked about it. 

Robin Lindley: From some history I’ve read, such as The Nobility of Failure by Ivan Morris, it seems that suicide is honorable in Japanese culture and tradition. And in your novel, some characters see suicide as an acceptable way to cope with loss and depression. 

Dr. Lawrence Matsuda: That's the samurai tradition. If you dishonor your master, or yourself, you must die too. That led to a custom of ritual suicide: hara-kiri, which translates as “cut the stomach.” And that’s what samurai did. And my friend, [the artist] Roger Shimomura, had ancestors who were famous for a double suicide. They stood face to face and stabbed each other simultaneously. So, they committed ritual suicide together.

Robin Lindley: That's an elaborate way to go. You indicate that Hanae and your mother were influenced by both Japanese and Christian traditions. Were those traditions a source of your mother’s strength and resilience through the catastrophes in her life? 

Dr. Lawrence Matsuda: Yes. I think both of them helped her. She could call on Japanese tradition to deal with her stress if an American tradition did not help. So, she had a little more of an arsenal, if you will, or two toolboxes to pull from. However, some tools that helped her survive became counterproductive. Take the Japanese word shikatanganai. “It can't be helped.” That word helps you get through, but after a while it doesn't move you forward. 

Robin Lindley: Yes. “It can't be helped.” When I read that phrase in your book, it reminded me of Vonnegut’s refrain: “So it goes.” It can’t be helped seems a pessimistic adage rather than we can change this or we can do better. 

Dr. Lawrence Matsuda: It isn't really. Japan was a harsh land of starvation, earthquakes, and typhoons. When your house fell down, no one in the village wanted to hear you crying because their house fell down too. And so it’s shikatanganai, it can't be helped. It's just what happened. 

And in America, a rich country, not a poor country like Japan, there is no shikatanganai. Here, your house falls and you call your lawyer. You sue the city. You sue the architect. You sue your neighbors. But it's not that it couldn't be helped. You’ve got to sue somebody. And it's really an irony that, in a poor country, they accept their fate but in a rich country, they always want to contest what happens. Not always, but there’s a different feeling. So this Japanese value helped my mother and others cope with overwhelming forces. 

Robin Lindley: Maybe that's akin to the acceptance stage of grief. 

Dr. Lawrence Matsuda: Yes, you accept fate rather than get angry.

Robin Lindley: It’s a different perspective. I was interested in your influences, and you have mentioned the naturalist writers such as Frank Norris and his classic novel The Octopus. Naturalism concerns how characters deal with the forces of nature, the forces aligned against them, and you write beautifully of how your characters take on fate. Do you see the influence of writers like Norris in how Hanae deals with forces beyond her control and then, it seems, becomes a force herself? 

Dr. Lawrence Matsuda: Yes. The naturalists felt that the forces of nature superseded human ambition. Human beings have to deal with natural forces at work in this world and these forces often overcame individuals.  In The Octopus, the novel by Norris, the railroad was a force which had to reach from coast to coast to deliver grain to the starving people in India. So that was another force to deal with. And even though the ranchers resisted the railroad, they couldn't stand up to it because the force was more potent. It had to deliver the grain to feed the starving masses. 

If you look at our situation today, there are numerous outside forces at play. One is obviously the pandemic. The other is the political situation. And these forces are largely out of our control. But in the novel, Hanae managed to survive the adverse forces, learned to surf the waves of the tsunami, and became a force herself--not a capital-letter-F force like feeding the starving in India, but a small force for equality and social justice.

We're in that kind of a situation now. The large forces out there can destroy us, but we must learn to use them and to survive them and become forces for good. And if many people get together and become forces themselves, they can become a large force, like a natural force, like the starving masses in need of grain. We need to persevere and make it to the other side and become forces ourselves.

Robin Lindley: And you have been a force for social justice and for democracy in your writing and in your activism and teaching.

Dr. Lawrence Matsuda: I have tried.

Robin Lindley: I’ve read about your many accomplishments. You’re too humble. You’ve written now about atrocious incidents and the resulting trauma, but you have also shared triumphs of the human spirit. Where do you find hope today?

Dr. Lawrence Matsuda: When I was a kid, I read all the Greek mythology in the Beacon Hill Library at grade three. And that helped me. I think that mythology is something like history. I recall that Pandora opened a box and unleashed all these horrible things. But the thing that was left in the box was hope. There is still hope.

Robin Lindley: Is there anything you’d like to add for readers?

Dr. Lawrence Matsuda: I'd like to speak to why the Japanese were incarcerated. Three presidents, Reagan, Bush Senior, and Clinton, addressed this in their letters of apology. They said the causes were racial discrimination, wartime hysteria, and failed leadership. And I ask you to take a look at what we have now regarding racial discrimination. My hope is that things get better. As for wartime hysteria, which was called propaganda then and is now called fake news, I hope that the network that peddles fake news crashes and burns. And for the last one, failed leadership, I hope that our failed leaders are repaired or replaced soon. So those are my three hopes.

Robin Lindley: Those are powerful thoughts to end on. Thank you for sharing your thoughtful comments Dr. Matsuda, and congratulations on your moving new novel, My Name is Not Viola. It’s been an honor to speak with you.

 

Robin Lindley is a Seattle-based writer and attorney. He is features editor for the History News Network (hnn.us), and his work also has appeared in Writer’s Chronicle, Crosscut, Documentary, Huffington Post, Bill Moyers.com, Salon.com, NW Lawyer, ABA Journal, Real Change, and more. He has a special interest in the history of human rights, conflict, medicine, and art. He can be reached by email: robinlindley@gmail.com.

How an American TV Mini-Series Helped the Germans Acknowledge the Holocaust

Meryl Streep in Holocaust, NBC, 1978

In a fascinating book, Learning from the Germans: Race and the Memory of Evil (2019), philosopher Susan Neiman praises the German people for coming to terms with their country’s role in the Holocaust. The reckoning took time, Neiman reports. For a few decades after the Second World War there was not much public discussion or teaching about the subject in Germany. In the late 1970s, however, a significant change occurred. Germans began to deal more openly and frankly with the record of Nazi persecutions. 

A visitor to present-day Germany can find numerous examples of this “remembrance,” notes Susan Neiman. There is a memorial to the Holocaust at the center of Berlin and there are “Stumbling Stones,” small brass plaques around the city indicating where Jews and other victims of the Nazis lived before deportation. Exhibits about the Holocaust can be found throughout the country, and educational programs at Buchenwald and other concentration camps describe horrible practices at these sites. On the anniversaries of tragic events, such as Kristallnacht, Germany holds “public rites of repentance.” Neiman says Americans can learn how to confront their nation’s troublesome history of slavery and racial oppression by considering Germany’s progress dealing with unpleasant facts about the past.

Why did the German people’s curiosity and interest in the Holocaust surge in the late 1970s? Years ago, I discovered an important clue to this attitudinal change when conducting research for my 2002 book, Reel History: In Defense of Hollywood. Working on a chapter called “Impact,” I examined history-oriented dramatic films that influenced public opinion and behavior in significant ways. During that investigation, I came upon details concerning Holocaust, an American-made mini-series that NBC released in the United States in 1978. Subsequent programming in Britain, France, and Sweden attracted large audiences. The greatest buzz and public discussion took place in Germany. 

Holocaust is a four-part docudrama with mostly fictional characters. Among its stars is Meryl Streep, then a young actress in an early stage of an extraordinary career. At the center of the story is a kind and respected Jewish medical doctor, Josef Weiss, and his extended family. Weiss’s nemesis is Erik Dorf, an unemployed lawyer who joins the Nazis. Eventually Dorf becomes a deputy to Reinhard Heydrich, a principal leader of the “Final Solution.” By the end of the film, most members of Josef Weiss’s family perish. The story exposes viewers to major historical developments from 1935 to 1945, including the Nuremberg Laws, Kristallnacht, concentration camps, and the Warsaw ghetto uprising. 

When Holocaust became available for West German television in 1979, some German TV executives did not want to broadcast the film. One complained that it represented “cheap commercialism” in a soap opera format. A program director dismissed the production as typical Hollywood entertainment, “not quite real, not quite truth.” Despite the executives’ resistance, the program appeared on local TV stations and became an instant hit. About half of West Germany’s population viewed some or all of the programs in the series, and many people in East Germany managed to watch it through antenna reception. About 30,000 viewers called television stations, requesting information. They asked: How could it happen? How many people knew?

The film made a significant impact on German society. A few months after its broadcast, West Germany scrapped the statute of limitations on Nazi war crimes. Media attention to the film provoked a “historians’ debate,” leading scholars to clash on questions about lessons from the record of German society under the Nazis. Educational leaders responded to the public’s interest by developing new courses for schools.

Books and documentary films about the Nazis and the Holocaust appeared in Germany before 1979, but they did not excite the degree of curiosity and interest that the mini-series aroused. Several media analysts in Germany pointed to the dramatic film’s powerful effect. Viewers became emotionally attached to the characters. They were upset when seeing the Germans’ indifference to human suffering and seeing Jewish figures harassed or cut down in brutal actions. Previous reports about this tragic history provided only names and numbers, the analysts noted. This production displayed the impact of historical events in graphic form. The victims seemed like real people. Audiences cared about the Jewish characters’ fate.

Susan Neiman makes a good point in her book. Americans, now struggling to acknowledge their country’s history of racial oppression and wishing to do something about it, can learn from Germany’s progress toward “remembering.” Yet Americans’ recognition of evils from history is not as limited as Neiman suggests. “Hollywood,” the generic name for America’s vast film and video-based industry, has made some worthy contributions to humanitarian awakenings. Holocaust helped Germans confront their troubled past, and in another notable example, Hollywood confronted Americans with demons from their own history.

Marvin J. Chomsky, the director of Holocaust, had tugged at the heartstrings of American viewers with an emotionally powerful drama broadcast on ABC Television in 1977, a year before the release of Holocaust. Roots, a mini-series about the experience of Africans and African Americans in slavery, attracted enormous audiences. The fourth and final program of Roots became the most-watched single episode of an American television show in history up to that point. Chomsky’s Roots did for the American people what Holocaust did for the Germans. The film aroused viewers to ask questions and seek information about a history that was less familiar than it should have been.

Fried Ice Cream and Steak: A Personal History of Hong Kong

The flight from Tan Son Nhut Airport in Saigon to Kai Tak Airport in Hong Kong on Cathay Pacific was short but sometimes rough and windy, especially on landing. After clearing customs, I hailed a cab outside the airport. Luggage in hand, "Jimmy's, please," I said to the driver, who knew the way without my telling him. It was usually my first stop, even before checking in at my hotel. I needed a Western-style meal, or even the semblance of one, after almost two months in Vietnam without a break from the daily grind of the war. Weaving through traffic, we were fast on our way to Jimmy's Kitchen, a European-style restaurant and a Hong Kong staple, a respite for Western journalists, contractors, military men, entertainers, and people with money from everywhere. NBC News' policy for Vietnam War staffers gave us ten days off every two months to catch our breath and refresh. More often than not, Hong Kong was my primary destination, a place I visited many times and where I lived in 1966, 1967, 1968, and 1969.

Known simply as Jimmy's, or, better yet but rarely used, Jimmy's Kitchen, the restaurant sat on a darkened street in Kowloon, down several concrete steps, behind a dark wood door with only a single light at the entrance. Kowloon was far enough from the Central District of the city, where most of the high-end shops, expensive restaurants, hotels, and financial institutions were (and still are). Amid the huge demonstrations going on today, people tend to forget there were equally serious demonstrations that turned into riots in 1966 over the price of a ferry ride. I was then in Saigon covering the Vietnam War when many people in Hong Kong protested a proposed rise of a few pennies, about 25 percent over the average cost of a daily ferry trip. With so many residents of Hong Kong on or near the poverty line, a few cents more for a ferry ride would seriously hurt their pocketbooks. Because people were getting no response from the government, they demonstrated and ultimately rioted, at the cost of one dead, almost 2,000 injured, and a like number jailed. The riots lasted three days. The rioters lost, the ferry raised its rates, and quiet ensued. However, the idea that demonstrations might work against government policies had come into play. It is now apparent that those early protests were a precursor to today's huge demonstrations in Hong Kong.

One year later, in 1967, there were sudden and unexpected (for some) demonstrations in the name of freedom from the rule of the British, then the colonial masters of Hong Kong. Mao Zedong, China's cultish dictator, inspired the riots with his brand of brutal communist ideology outlined in his Little Red Book, the bible of his philosophy. I vividly recall witnessing a reasonably peaceful demonstration on a Hong Kong main street: hundreds of Chinese students wearing starched white shirts, dark trousers, and plastic sandals, marching together as one, brandishing the famous red vinyl-covered book. It was late afternoon, the sun not yet down, at the end of the workday. Most of the people on the crowded street stood by silently, while a few others punctuated the march with a smattering of orderly applause.

The late 1960s was also a time of terror in Hong Kong, when makeshift bombs went off in doorways and on the streets of the city. The Red Guard, a Mao creation formidable in its control of Mainland China, was looking to establish a foothold in Hong Kong and set off many bombs in the city in an effort to oust the British. Though people died and suffered injuries when some bombs went off, the Red Guard's campaign created disruption and fear more than serious destruction. It was enough to keep the city on edge. One day at lunchtime I was in a crowd that surrounded a small suitcase in the middle of an intersection. Traffic stopped. Police were everywhere. The growing crowd was quiet but tense as we shuffled our feet and waited for the police to disarm what everyone believed was a bomb. To a collective sigh of relief, the bag was empty and everyone quietly dispersed. Dinner was waiting.

Today, with British colonialism long gone, the continuing mass protests, fomented by Hong Kong's younger generation, are, ironically, for freedom from China, in opposition to an increasingly repressive Mainland government. The riots regularly take place in the Central District, in the center of the city on the Hong Kong side of the island. Jimmy's, until it closed a few years ago, remained tucked away on a quiet street in Kowloon, successfully serving its diverse patrons.

I do not want to overly extol Jimmy's virtues. In a big room with low lights, dark wood paneling, and wide spaces between tables covered in starched white tablecloths, waiters clad in pajama-style uniforms moved silently and served with no fuss. The restaurant had the look and feel of an exclusive British club in Mayfair, London. The drinks were lavish and large. The Hong Kong-brewed San Miguel or Japanese Asahi beer was never cold enough, but that was how the British liked their favorite drink, so we adjusted. The food was good, not great. The steaks were well aged. Chicken Kiev was consistent. Beef Stroganoff always satisfied. The onion soup with a heavy dollop of cheese was chewy and delicious. It was more than a restaurant, though, catering mostly to Westerners. Instead of designating us as American or French or British, we all fell under the easier term, European. In times of war we accepted the designation with good will.

The best part of the meal at Jimmy's was dessert, particularly its Baked Alaska, a treat to behold for its beauty and richness. I preferred the fried ice cream, an orb as big as a baseball consisting of freshly made vanilla ice cream wrapped in cake batter and quickly deep-fried, a sweet treat that I found nowhere else in Asia. It was always worth the trip.

Coming from Saigon as I did at least twice a year, the lure of Jimmy's was, strange as it may seem, its sameness. Even now, if I close my eyes, I see the British club it wanted to be; for a few hours at dinner, or at lunch, it became a peaceful break from the reality of war in Vietnam. Today, all semblance of peace is gone as Hong Kong seethes in its struggle between authoritarianism and freedom.

Prop 16 and the "Chinese Virus" Bring Two Views of Asian American History into Conflict

Anti-Proposition 16 Car Parade, San Francisco Peninsula, August 2020

Nationwide Black Lives Matter protests over the past several months have rejuvenated fights against ongoing racism. A surge of harassment and assault against Asian Americans during the pandemic signifies the recurrence of xenophobia and racial animosity. The racialization of Covid-19 as the “Chinese virus” awakens the dormant yellow peril trope. Meanwhile, a group of Chinese Americans, mostly first-generation immigrants, have been organizing flag-flying, placard-displaying car rallies in the Bay Area and southern California, protesting Proposition 16—a ballot measure that aims to restore affirmative action in California.

Disregarding the structural inequalities that race- and gender-conscious affirmative action seeks to dismantle, anti-Prop 16 protesters embrace a conception of equality that comprises two basic ideas: individual effort and colorblindness. They consequently consider racial and gender preferences inherently discriminatory. Comprehending this individualistic view, which underwrites mere surface equality, requires tracing the history of Chinese Americans' struggle for equality.

In the late 1960s and early 1970s, a group of Chinese American social activists, inspired by the civil rights movement, contested surface equality in American courts. Lau v. Nichols is a landmark case wherein limited-English-speaking students of Chinese ancestry in San Francisco alleged the denial of equal educational opportunity by the school district due to the lack of bilingual education. The United States Supreme Court ruled in 1974 that “there is no equality of treatment” without adequate bilingual education, the “effect” of which constituted discrimination. Mandating “different treatment,” the court directed the school district to “take affirmative steps to rectify the language deficiency” for racial minority students. As direct beneficiaries, Chinese American students enjoyed the benefits of these structural improvements.

In a more open and equal social milieu, a growing yet diverse Chinese American community emerged. The ethnic Chinese population almost doubled in the 1970s as a result of the Hart–Celler Immigration Act of 1965. This more liberal immigration law favored immigrants seeking family reunification and those with professional occupations. Many Chinese Americans with professional skills and capital rode the wave of opportunity in the post-civil rights era to achieve socioeconomic success, whereas the majority of new immigrants who came for family reunification struggled in urban poverty. Public perceptions of a successful minority group rising from historical discrimination overlooked the vast intragroup socioeconomic divisions. 

Re-emerging in the 1980s, the model minority myth portrayed Asian Americans as an example of self-sufficiency and individual achievement. In contrast to the structural interpretation, the cultural rhetoric that emphasized familial and cultural attributes dominated the public view. Many middle-class and wealthy Chinese families welcomed the illusory rhetoric because it fit in well with traditional values and beliefs that the parents had carefully maintained to nurture their children. This cultural discourse functioned as a powerful force informing Chinese Americans’ understanding of equality. 

The positive stereotypes soon backfired. Public perceptions of Asians as disproportionately successful in American society drove a growing amount of anti-Asian resentment. The once positive portrayal of Asian students was repositioned to depict them as monotonous and lacking character and leadership. In order to curtail rising Asian American enrollments and stem declining white enrollments, UC Berkeley made several undisclosed admissions policy changes in the mid-1980s that disfavored Asian applicants. After discovery and investigation by a coalition of Asian American community organizations, and further pressure from state government agencies, Berkeley Chancellor Ira Michael Heyman apologized twice and publicly acknowledged the university's discriminatory policies.

With the disputes barely settled, anti-affirmative action politicians moved in quickly to exploit the Berkeley situation by targeting race-based policies in general. Historian Dana Takagi argues that the political manipulation shifted the focus of discourse from anti-Asian racial discrimination to the faults of affirmative action. Elaine Chao, then U.S. Deputy Maritime Administrator, wrote a 1987 op-ed in Asian Week, connecting the racial quotas against Asian Americans in Berkeley’s admissions process to the university affirmative action programs for underrepresented minorities. Other conservative politicians and intellectuals joined the fray to reinforce the conflation. 

The heightened conflation of anti-Asian racial discrimination and race-based policy manifested in Ho v. San Francisco Unified School District (SFUSD). Since the 1980s, the SFUSD had implemented court-mandated racial caps in public schools to achieve school integration. In the 1990s, the racial caps' negative impact on ethnic Chinese students, who faced the highest score cutoff among all racial groups to qualify for admission to a top alternative high school, became more pronounced. Several ethnic Chinese students filed a class-action suit against the school district, alleging that the imposed racial caps constituted racial discrimination. The lawsuit found impassioned support from anti-affirmative action Chinese Americans who ignored the mandatory nature of public education and equated the racial caps with affirmative action. This resentment of race-based policies dovetailed with the conviction in the cultural rhetoric, forging a specious argument among some Chinese Americans in support of colorblind policies.

This stance has resonated with many newly arrived Chinese immigrants. These well-off suburban dwellers, most of whom work as professionals, rushed to adopt a misguided position that suppresses race as an essential element in American social relations. Even the pandemic failed to shake their belief in the model minority myth and subdue their passion for protesting Prop. 16. Little wonder that the car rally organizers are part of a broader coalition that supports a recent suit against Harvard, whose political repercussions recall the Berkeley admissions controversy. 

The past never vanishes. But what the past really entails for our present depends on an accurate and nuanced interpretation of it. The revival of racist and anti-immigration narratives around the "Chinese virus" attests to the illusion of a colorblind society. Deeply rooted in the history of the United States, racism and xenophobia never go away. In a society where people still believe in racial hierarchies, even a "model minority" group runs the risk of being cast as a threat by those deemed racially superior, whether in the form of university overrepresentation or disease carrying. Over the years, race-conscious policies have induced profound change in institutions, bringing about structural improvements. Nevertheless, until racial hierarchies are shattered, racial discrimination will persist regardless of where a racial minority group is ranked along the hierarchy.

Roundup Top Ten for September 11, 2020

Our Long, Forgotten History of Election-Related Violence

by Jelani Cobb

"A weather forecast is not a prediction of the inevitable. We are not doomed to witness a catastrophic tempest this fall, but anyone who is paying attention knows that the winds have begun to pick up." 

Think The Trump Tapes Are Worse Than The Nixon Tapes? Think Again.

by Leonard Steinhorn

Recordings of the President reveal not only racial bigotry but a cynical indifference to the rule of law and a belief that any means were justified to prevail over political adversaries.  

This Republican Party Is Not Worth Saving

by Tom Nichols

"The hardening of the GOP into a toxic conglomeration of hucksters, quislings, racists, theocrats, and cultists is already happening. The party gladly accepted support from white supremacists and the Russian secret services, and now welcomes QAnon kooks into its caucus. Conservatives must learn that the only way out of “the wilderness” is first to vanquish those who led them there."

Neoliberal Hong Kong Is Our Future, Too

by Macabe Keliher

While orthodox economists like to point to Hong Kong as an ideal free market, the social consequences have been disastrous: inequality is rising, wages are declining, working hours are increasing, economic opportunity is dwindling, and housing is so unaffordable that office workers sleep in McDonald's. Is it any wonder that the streets are now burning?

The Supreme Court’s Starring Role In Democracy’s Demise

by Carol Anderson

By accepting bad-faith arguments that the laws in question are race-neutral, the Supreme Court today repeats the shameful actions of the courts of the 1890s, which gave judicial cover to state laws explicitly designed to disenfranchise Black voters. 

Reform the Police? Guess Who Funds My State’s Officials

by Miriam Pawel

Translating protest into reform depends on breaking the influence law enforcement unions exert on state legislators, including through campaign contributions.

Trump’s Law-and-Order Campaign Relies on a Historic American Tradition of Racist and Anti-Immigrant Politics

by Austin Sarat

"Throughout this nation’s history, appeals to law and order have been as much about defending privilege as dealing with crime. They have been used in political campaigns to stigmatize racial, ethnic and religious groups and resist calls for social justice made by, and on behalf of, those groups."

In 2020, Voting Rights are on the Ballot

by Peniel Joseph

Black citizenship remains the best yardstick to measure the nation’s democratic health, and even before the coronavirus pandemic, the Black vote in large parts of the country remained imperiled.

Trump’s 2020 Playbook Is Coming Straight From Southern Enslavers

by Elizabeth R. Varon

In arguing that radical protesters endanger U.S. law and order, Trump is echoing the attacks leveled by Southern enslavers against abolitionists.

Covid-19 Has Exposed The Consequences Of Decades Of Bad Public Housing Policy

by Gillet Gardner Rosenblith

Poor and economically precarious Americans are at risk of eviction in the COVID-19 crisis because American policymakers have spent decades rejecting a public role in providing decent housing outside of the market system. 
