Joseph S. Lucas and Donald A. Yerxa, Editors
Randall J. Stephens, Associate Editor
Historically Speaking: The Bulletin of the Historical Society
July/August 2006
Volume VII, Number 6
--David S. Brown, "Déjà vu All Over Again: Rereading Richard Hofstadter by the Light of the New Right"
--Bruce Kuklick, "How the Kennedy School of Politics Was Born"
--Beyond the Niebuhrs: An Interview with Robert Orsi on Recent Trends in American Religious History
Conducted by Randall J. Stephens [full text]
--Jerry Brotton, "When Art Meets History: The Sale of King Charles I's Art Collection"
--David E. Nye, "Why Technology Matters"
--Vernon W. Ruttan, "Is War Necessary for Economic Growth?"
--Trevor Burnard, "Only Connect: The Rise (and Fall?) of Atlantic History"
--Evan Mawdsley, "'Victors Are Not Judged': Byways on Stalin's Road to Berlin"
--Pamela Kachurin and Ernest A. Zitser, "After the Deluge: Russian Ark and the Abuses of History" [full text]
--Bertram Wyatt-Brown, "Conjectures of Order: A Review Essay"
--"Historians in the Midst of Catastrophe: Reflections of the University of New Orleans's Department of History after Katrina" [full text]
Déjà vu All Over Again: Rereading Richard Hofstadter by the Light of the New New Right
David S. Brown
We may nod respectfully at Richard
Hofstadter’s work and casually invoke his name to lend a certain gravitas
to our own, but something is lost in the translation if we approach him
only as the late, great past master/public intellectual of his era.
For perhaps no historian addresses the rightward shift of contemporary politics so insightfully as Hofstadter. The child of mixed gentile and Jewish parentage, Hofstadter grew up in the multiethnic
city of post-World War I Buffalo. His formative experiences and ideological
commitments were shaped by two shattering historical events, the Great
Depression and the Second World War. The purposeful intervention of the
central government during these years to combat both economic upheaval
at home and Nazi power abroad struck Hofstadter as the foundation of a
fresh liberal tradition in American life. The promise and complexities
of the new politics preoccupied this distinguished Columbia historian through
the balance of a prolific career.
The old liberalism, Hofstadter insisted,
had to go. It embraced, he believed, a host of archaic or self-defeating
convictions—isolationism, unregulated capitalism, and individualism—rooted
in the nation’s preindustrial past. As an expression of political culture,
the old liberalism had sustained a farmers’ republic for decades, during
which generations of Americans believed that the concentration of state
authority constituted the greatest threat to their liberties. Even after
the industrial crisis of the 1890s began to shake this system, critics
like Hofstadter were quick to note that despite the populist and progressive
impulses that informed the pre-New Deal “age of reform,” original approaches
to government activism (protection of workers, regulation of important
areas of the economy, and social welfare/security) had yet to be worked
out.
To American intellectuals in 1940,
it seemed an even bet that the paternal state pioneered by the New Deal
would not survive the return of prosperity. That year, Lewis Mumford published
an anxious essay in The New Republic decrying “The Corruption of
Liberalism.” His analysis anticipated by a few years Hofstadter’s more elaborate
work on the subject. “Undermined by imperialism and monopoly” and acquiescent
in the expansion of Soviet Russia and Nazi Germany, “the record of liberalism
during the last decade,” Mumford wrote, “has been one of shameful evasion
and inept retreat.” An ideological breach had opened in America, and the
1940s proved to be a critical decade for the reconstruction of a new reform
tradition that could break cleanly from the politics of the past. As a
young historian, Hofstadter’s first books, Social Darwinism in American
Thought (1944) and The American Political Tradition (1948),
alertly challenged the ideas and personalities that sustained the old consensus.
The latter work in particular—a biographical study that replaced hero worship
with a nuanced and devastating assessment of the pre-New Deal party systems—captured
the imagination of postwar students. No doubt the book owed its popularity
in great part to contemporary circumstances. The ideological uncertainty
of the 1930s gave way in the 1940s to decisive liberal leadership that
combined social welfare reform at home with a commitment to contain Soviet
power abroad. “In the United States at this time,” Hofstadter’s colleague
Lionel Trilling confidently wrote a decade after Mumford’s bleak missive,
“liberalism is not only the dominant but even the sole intellectual tradition.”1
Hofstadter never believed this . . . .
David S. Brown is associate professor
of history at Elizabethtown College and author of Richard Hofstadter:
An Intellectual Biography (University of Chicago Press, 2006).
How the Kennedy School of Politics Was Born
Bruce Kuklick
In the late 1940s a number of younger academics sought to make their careers
in a way not wholly oriented to traditional university departments. These
students wanted to be regarded as prudent, hands-on specialists in foreign
affairs—knowledgeable about the grim causes of World War II and experts
in the battle against the Soviet Union. They frequently described themselves
as “Realists.” Institutes for the study of policy often appeared as the
best venue to undertake their work. At a number of universities new centers
for such study sprang up, replacing older and pokier bodies of minor importance
that had haphazardly analyzed international events from the 1930s on. The
new and revived entities were attempting to duplicate the success of the
early air force think tank, RAND. But the association with the military
was thought to compromise the independence of RAND, although many of the
people who consulted for it had positions at schools of higher learning.
In the late 1940s and early 1950s scholars at Chicago, Columbia, Johns
Hopkins, MIT, Princeton, and Yale jockeyed to build organizations that
would put RAND conceptualizations to work in a collegiate environment.
In the 1950s and early 1960s the Harvard department of government was out
of step with developments at these leading universities. Its name suggested
commitment to something slightly different from “political science.” While
Harvard’s prestige ensured that its scholars were well represented in policy-making
circles, two older scholars, William Y. Elliott and Carl J. Friedrich,
who had credentials as political theorists, still dominated the department.
While more freestanding scholarly entities on the campus offered younger
faculty more fashionable niches, officials regularly thought of closing
the most significant of these, the Harvard School of Public Administration—the
Lucius Littauer Center. The sleepy center had existed since 1935 but was
a stepchild in Cambridge, an administrative unit that faculty in the departments
of economics and government jointly ran. While the connection to Harvard
had made Littauer more than respectable, training in public administration
lacked the excellence associated with Harvard’s schools of law and medicine.
As one evaluating committee put it, the master’s degree that the center
awarded had “never been entirely satisfactory” and the students “not fully
up to the standards of the Arts and Sciences departments.” It contributed
little to the field of security studies that scholars in places like Princeton
and Hopkins had established some fifteen years before.
The situation changed at the end of 1963 when the family of John F. Kennedy
initiated plans for the presidential library that would house material
from the administration of the recently assassinated leader, immediately
elevated to a mythic existence . . . .
Bruce Kuklick is Nichols Professor of American History at the University of Pennsylvania.
His latest book is Blind Oracles: Intellectuals and War from Kennan
to Kissinger (Princeton University Press, 2006).
Beyond the Niebuhrs: A Conversation with Robert Orsi on Recent Trends in American Religious History
Conducted by Randall Stephens
Robert Orsi is the Charles Warren Professor of the History of Religion in America
at Harvard University. Orsi’s work on Catholic devotionalism, and what
some call “lived religion,” has helped reorient religious history in the
U. S. to the common men, women, and children who practiced their faith
from day to day. His most recent work, Between Heaven and Earth:
The Religious Worlds People Make and the Scholars Who Study Them (Princeton
University Press, 2005), combines autobiographical insight with keen scholarship.
Peering into the worlds of devotees and academics, Orsi explores Catholic notions of suffering and the presence of the sacred for believers, and even scrutinizes serpent handlers and the observers who study them. He
is currently completing a book on growing up Catholic in the 20th-century
U. S. In that work he plans to explore the religious worlds children
made and how they formed their beliefs in creative, imaginative ways. In
May 2006 Randall Stephens, associate editor of Historically Speaking, spoke
to Orsi in his office at Harvard University.
Randall Stephens: Would you comment on the concept of devotionalism that is so
central to much of your work?
Robert Orsi: What intrigues me about devotionalism—and one of the reasons
I find it so interesting to study historically—is that it’s a charged field
where institutional imperatives intersect with people’s own efforts and
desires to create a religious world. There’s a powerful interplay at work
in devotional practice, and the results are not predictable. For instance,
the cult of the Virgin Mary does not in any simple sense enforce an image of docile femininity on 19th- and 20th-century Catholic women. Women appropriate the Virgin Mary. They pray to her; they make her
part of their lives. She becomes a mother, sister, grandmother to them,
and at that point, they use her to authorize their lives, to think about
their lives in different ways, to make certain changes that might otherwise
have been unthinkable. So I don’t like the simple grid of empowerment/disempowerment.
Devotionalism offers a much more subtle ground where people live real,
necessarily limited and contradictory lives.
Stephens:
In Thank You, St. Jude you explore the combination of institutional
and popular dynamics in Catholic devotionalism. You argue specifically
that St. Jude became the patron saint of lost causes in part because devotional
promoters in Chicago sold St. Jude to a mass audience. But there were also
women who made the saint their own.
Orsi:
I see this as the dialectic of devotionalism. Women didn’t make up the
devotions to St. Jude. These were the creation of priests in Chicago who
needed money for their new church in a poor Mexican neighborhood. Devotions
were a rich source of income in the mid-20th century. So women inherited
Jude. But once Jude was there, then women came and made Jude part
of their lives, and in turn Jude helped them live these lives.
For example, the Catholic press in the 1930s and 1940s insisted that women
should not work outside the home, that the working mother was a bad mother.
Yet at the same time, women were praying to Jude in their devotions to
him to help them find jobs because they needed to support their families.
That’s what I mean about devotionalism as the ground on which women could
live their lives as they found them.
Stephens:
Scholars like Eugene McCarraher have criticized the fusion of consumerism
and Christianity in America. What are your thoughts on the subject?
Orsi:
McCarraher is right. It’s there all along. If you look back, for
example, to Henry Ward Beecher’s appearances in advertisements for soap
and flour, you can see this at work, as well as in his sermons. Consumerist
faith represented an effusive, romantic Christianity, one consumed to express
and even to constitute one’s interiority. Modern American capitalism was
able to make use of the religious cultures at its disposal, and to transform
them in the process.
Stephens:
In a review of your latest book, Between Heaven and Earth: The Religious
Worlds People Make and the Scholars Who Study Them (Princeton University
Press, 2005), John T. McGreevy criticizes your work for not giving enough
credit to the bishops, priests, nuns, and lay leaders who changed the church
after Vatican II in the 1960s. How do you respond?
Orsi:
I don’t think we have the historical materials yet to make generalizations
about how the complex transformations of the era took hold or didn’t. Take
the trajectory of Reformation studies as an analogy here: eventually understanding
the processes of reform demanded town-by-town, if not street-by-street,
studies. Something like this will be necessary to approach the effect of
the Council’s reforms, because there were regional variations, ethnic and
social class variations. So local studies of how things developed after
the Second Vatican Council—the liturgical changes, ethical rethinking,
the serious theological reformulations the Council offered—are still necessary.
We need some account of how these matters were presented to laity in different
parts of the country by different clergies. The northern Midwestern experience
of the era was different, for instance, from that in the old industrial
Northeast. We’re not at a place yet to make generalizations about the reception
of the Second Vatican Council, I don’t think.
Having said that, I’ve been doing soundings in local history as I research my
book on growing up Catholic in the 20th century. Again, as in past
work, I’ve been using ethnography as well as history, and the people I’ve
spoken to around the country in what I’ve been calling memory groups have
very much wanted to talk about their experience before and after the Council.
Time after time, I was told that church officials presented the mandated
liturgical and ecclesiastical changes without explanation, without discussion
among lay people. I don’t know of a lot of instances where pastors held
public conversations to explain, “these are the changes, let’s talk about
our feelings about the changes, and our understandings.” This was not the
culture of Catholicism at that time. Priests were not inclined or trained
to go out to the laity and say, “You know, the Friday meat prohibition
is now being lifted. Let’s talk about what you think about that and what
you feel.” Vatican II and its aftermath represented the changing
nature of Catholicism in the United States and worldwide. It marked a deep
revolution in how people experienced their sacred world.
Stephens:
How well do you think American religious historians understand the American
religious experience in general?
Orsi:
American religious history, as it is practiced in the universities today,
is insistently committed, consciously or not, to Niebuhrian neo-orthodoxy
as its moral vision, and this profoundly influences the historiography.
We celebrate those aspects of American religious history that are admirable
from the neo-Orthodox perspective. We don’t even ask, “Was the rise of
neo-orthodoxy a good thing?” It’s arguable that as mainline Protestant
seminaries in the U.S. became neo-orthodox strongholds, as they almost
all did, obsessed with the idea that Christian integrity demanded a Christ
against culture, Protestant liberal clergy lost the capacity effectively
to participate in and to speak to American culture. Was this
a good thing? But neo-orthodoxy as a lens for American religious historiography
prevents us from asking these kinds of questions. There are exceptions—I
think of Marie Griffith’s work on Christian diets or Larry Moore’s study
of Christian entertainments, and so on, studies done without the edge of
moral condemnation that otherwise so characterizes the field.
Beyond that, how much does neo-Orthodoxy or Niebuhrian realism tell us about American
religion? I tell my students that if they want to understand the
history of modern American religion, they have to look at figures like
Henry Ward Beecher and Peale. I say this without then going on to impose
a neo-Orthodox moral judgment on such figures. Beecher is more important
than the figures we tend to lionize because, in fact, American Christianity
became his Christianity.
Stephens:
There are some in the field who have turned attention to other areas. One
thinks of historians like George Marsden, Mark Noll, and Nathan Hatch,
for example. Marsden has also called historians who happen to be Christian
to acknowledge or embrace their faith perspectives. He has argued that
religious viewpoints “ought to be granted a fully legitimate place in the
mainstream academy so long as they prove themselves academically worthy
in the same way that other points of view do . . . .” Would you agree with
him?
Orsi:
As much as I respect his work, I’ve never quite understood what George
Marsden is talking about when he criticizes the allegedly normative or
exclusive secularity of the American university. First of all, there’s
no lack of respect for evangelical historians in the academy. He himself
is surely proof of that. And this is true of all the historians who are
in the evangelical circle: Mark Noll, Joel Carpenter, Grant Wacker, among
others. I don’t see any exclusion of these historians from any conversation
on the grounds of their insider status. So too Richard Bushman is a distinguished
and widely admired Mormon scholar of Mormonism. So it strikes me that there’s
a little bit of the “evangelical as victim” thing going on here. I don’t
think there’s any truth to that claim. If a historian does his or her work
well, I truly don’t see what the issue is.
When I had just gotten out of graduate school and was in my first teaching job,
I was given the unhappy but necessary assignment of speaking with an evangelical
Ph.D. student who wanted to argue that the unpredictability of the first
Great Awakening, its odd patterns and surprising conversions, pointed to
the reality that the Holy Spirit was the cause of the revival. I had to
tell him that this was an unacceptable interpretation, and if he persisted
he wouldn’t be allowed to continue in the program. Now, if this is
what evangelical critics are talking about, if that’s what it means to
bring one’s religious vision into the classroom, then I think this position
is completely inappropriate. You can’t get an academic degree based on
notions of divine causality given that our critical inquiry is necessarily
limited to the human.
If the thrust of the evangelical critique is that we need to give a rich and
nuanced account of people’s religious worlds, of their religious thought,
and of the religious motivations that move them—move them even apart from
political or economic considerations—I can’t think of a single American
religious historian who would dissent. So, Marsden’s argument strikes me
as criticism without an object in view.
Stephens:
What do you make of Stanley Hauerwas’s critique of religious studies departments,
that they make campuses inhospitable for people who practice their faith?
Orsi:
I find it tendentious. I have yet to see any evidence for this, and surely
such statements, to be acceptable in a university setting, require evidence.
Religious studies departments—and I am familiar with many of them—are thoughtful
venues for the study of different religions. I have never heard of a student’s
religious question being rudely dismissed in a classroom. It is the case
that critical studies of religion raise questions about religious worlds
that might make some practitioners uncomfortable. But professors are generally
quite willing to talk about even this discomfort in classrooms and in office
hours. Is Hauerwas calling for a faith that is not up to intellectual scrutiny?
Clearly not, given his work in general. Again, I don’t get the point
of such critical attacks.
Stephens:
I believe the critique is that the pluralism of the academy doesn’t make
room for people who are exclusivist.
Orsi:
But what does this mean in practice? If people are exclusivists and think that Hinduism is the devil’s work, would that mean they would or should dismiss the study of that subject, that they could go into a classroom and teach Hinduism as demonic? How would this advance knowledge? On the
other hand, plenty of foundational work in the study of religions other
than Christianity was done by deeply committed and exclusivist Christians.
The question is how Christians deal responsibly with their Christianity
in an environment of open critical inquiry, in which all questions are
fair game. And regardless of what some evangelicals say, the same
is asked of all professors, whatever their personal commitments.
A photographer friend of mine, Rich Remsberg, produced a fine book of photos and text
on Pentecostal motorcycle riders called Riders for God. I got to
visit the bikers’ church a few times with Rich. This was in rural
southern Indiana. The pastor came over to me one night during a little
conversation hour after one of the services and asked, “So what do you
do?” I told him that I was the chair of the religious studies department
at Indiana University. “Well, what do you teach there?” he wanted to know.
I said, “We teach all the world religions, Christianity, Hinduism, Buddhism,
and so on.” And he teased me, although he was serious, “you teach those
other religions to show how they’re wrong, right? How Christianity is the
only true religion?” That’s okay for a church, but not for a university.
Stephens:
In Between Heaven and Earth, you remark that it is a challenge to
write about figures of “special power”—saints, demons, ancestors, gods,
ghosts—as agents in history. Could you elaborate on how you go about that?
Orsi:
I think that the religious studies community does a pretty good job of thinking
about these questions. Dipesh Chakrabarty’s Provincializing Europe:
Postcolonial Thought and Historical Difference issues a strong challenge
from subaltern studies to the historical guild, asking us to think about
how such supernatural figures held so closely by various peoples act in
history. I think religious historians have done a better job of writing
about angels, demons, spirits, etc. when they’re not looking over their
shoulders apprehensively at what secular historians might say about this.
It’s important to see the ways in which figures like St. Jude are not puppets
being manipulated by the clergy or by women in the circumstances of their
lives. They are, in fact, rich imaginative creations that acquire a vivid
life of their own and in an important and historically relevant sense break
free of their creators.
Stephens:
Certainly, someone who is devout would say, “These are not ‘creations.’
They’re independent beings that exist in their own right.”
Orsi:
I’m really interested in the question of how these figures become real
in particular times and places. My current work helps me understand how
children relate to these figures, and this in turn is helping me find ways
of thinking about the historical and cultural reality of the really real,
since the cast of a culture’s imagination begins in its work with children.
How do the apparitions at Lourdes or the religious manifestations at the
1906 Pentecostal revival on Azusa Street in Los Angeles become real for
devotees? Or what historical sense can we make of the fact that people
claimed to see Jesus there during one of Aimee Semple McPherson’s revivals?
For the faithful, there’s something ontologically unsurprising going on.
But how can we historians talk about the human historical and cultural
dimensions of such things?
Stephens:
The new work that you’re doing is on children. It’s a topic that, like
the study of imaginative religious beings, doesn’t figure prominently in
historians’ work. What drew you to the topic of children?
Orsi:
Historian Steven Mintz writes that children are largely invisible in history. They’re
doubly invisible in religious history. Often they show up in religious
contexts as objects of dread and fear. Periodically in American religious
history there have been moments of high anxiety about the risks that the
culture is posing to children, or the dangers of American life to children,
or of the dangers that children are posing to society. And there are certain
incidents, which could easily be recast as children’s events, like the
first, colonial-era revivals in which children were active participants. As
I write in my book, all the Marian apparitions of the 19th and early 20th
centuries were to children. Still, children have been really invisible
in American religious history and in religious history generally.
So this challenge drew me. But in terms of American Catholicism,
which throughout the 20th century was not only preoccupied with the religious
formation of children but had built one of the most extensive institutional
structures to achieve this, it is impossible to understand the life of
the modern Catholic church in the U.S. without understanding it as the
creation of adults and children in relation to each other.
Stephens:
In the religious communities you study how have adults understood children
and childhood?
Orsi:
I think part of the problem is that adult-children relations in religious
settings have not been approached with the full complexity and historical
specificity that they require. There’s the idea that adults pass on
their religious beliefs to children, but this is always a fraught process,
the “passing on.” That’s particularly the case if we’re talking about immigrant
families or migrant families. I wrote an article many years ago in the
Journal of Social History about the “religious fault lines” that I saw between
Southern Italian immigrants and their American children, and the question
has been on my mind since. To what extent has modern American Catholicism
been shaped by this particular fault line? But there are real problems
with finding children in history. You have to use a historical methodology
akin to that of historians using Inquisitional documents to reach the lives
of the people the Inquisition targeted and who appear in documents refracted
through authority. I have been using in part works written for or about
children by adults, including prescriptive literature and Catholic children’s
comics.
Stephens:
It sounds like the problem of using European colonial sources to study
Native Americans.
Orsi:
Right. You’re reading Jesuit missions documents to understand Native American
religious cultures. That’s why I use these memory groups, around
the country, getting groups of people together in different parts of the
U. S. I’ve been talking to people over 50 about their memories of growing
up and what their childhoods were like, and then setting those memories
in relationship to whatever printed sources I can find.
Stephens:
What is your impression of the religious worlds that these children created
or inhabited in pre-Vatican II America?
Orsi:
One way of approaching the history of a particular culture of childhood is
to begin with the anomalous, for instance with a story that you can’t imagine
a nun actually telling or a religious idea that is completely idiosyncratic.
For example, I came across a widespread children’s notion that God made
priests forget everything that they just heard in confession. Kids made
that up some place at some time, working with what the nuns told them about
the seal of the confessional. They told each other. This idea became an
item for historical reflection, raising the question of what it was that children were so concerned about that they would have generated this story.
What does it disclose about children’s fears or desires?
Stephens:
How will your work on the subject expand our knowledge of American religious
history in general?
Orsi:
I agree that that’s a question a historian of childhood has to ask. How does
this contribute to the general historiography? So to take one example,
there was a real pressure on American priests from the 1920s to be able
to speak to children, to be able to address children in language they could
understand, in part because there were so many children around, because
Catholics insisted that their children go to church from a very early age.
Priests became specialists in “boyology,” and particular priests developed
reputations, in some cases nationally, as being good with boys. Modern
Catholic notions of clerical deportment, in other words, the style and
delivery of clerical authority, developed in part in relation to
children. Putting children back in the picture raises suggestive questions
about the actual making of lived religious worlds.
When Art Meets History: The Sale of King Charles I’s Art Collection
Jerry Brotton
There still remains a tendency within historical inquiry to marginalize visual
artifacts, or at best to see them as passive reflections of larger social
processes (think of how often Van Dyck’s paintings are used as illustrations
in this way). This tends to limit our understanding of how individuals
in history used artifacts to make sense of their world. It also restricts
our appreciation of how objects like pictures cross social, political,
and cultural boundaries and are involved in the making of history (especially
in an era that predates the public galleries and museums of the 18th and
19th centuries).
I came up against this problem while examining the dispersal of King Charles I’s
art collection in the years 1649-1654. Everyone who works in the field
thinks they know the story of the so-called “Sale of the Century,” which
was also the title of Jonathan Brown and John Elliott’s 2002 Prado exhibition
and Yale University Press edited collection. Yet what struck me as I began
my preliminary historiographical reading into the subject was that nobody
had actually undertaken a systematic study of what happened to Charles’s
art collection from the creation of the Rump Parliament through to the
establishment of the Cromwellian Protectorate.
Compounding this problem is the accretion of myth that grew up around the Commonwealth
sale, especially after the successful restoration of the monarchy in 1660.
Royalists and later historians of art were horrified at what they regarded
as the sale’s cultural vandalism. Writing in 1685, William Aglionby angrily
denounced the sale, insisting that “had not the bloody-principled zealots,
who are enemies to all the innocent pleasures of life, under the pretext
of a reformed sanctity, destroyed both the best of kings, and the noblest
of courts, we might to this day have seen these arts flourish amongst us.”
By this time Charles I was being recast as a saint and martyr of the royalist
cause, a connoisseur ahead of his time, brought down by iconoclastic philistines.
Eighteenth-century historians embellished the myth, castigating the Commonwealth
for destroying what they regarded as the first flowering of sensibility
among the polite arts in England. Horace Walpole claimed that throughout
history “the mob have vented their hatred to tyrants on the pomp of tyranny.
The magnificence the people have envied, they grow to detest, and mistaking
consequences for causes, the first objects of their fury are the palaces
of their masters . . . . This was the case in the contests between Charles
and his parliament.” Like many other connoisseurs of his day, Walpole viewed
the consequences of the sale through the prism of prevailing 18th-century
conventions of taste and sensibility. The Victorians romanticized the aesthetic
Charles; official royal publications and Web sites continue to describe
the Commonwealth sale as a national tragedy . . . .
Jerry Brotton is Senior Lecturer in Renaissance Studies at Queen Mary, University
of London. His latest book, The Sale of the Late King’s Goods: Charles
I and His Art Collection (2006), is published in the UK by Macmillan.
Why Technology Matters
David E. Nye
Last year a bright student came to me to talk about her interest in the history
of bells and timekeeping in Britain from ca. 1300 until the 18th century.
She had some interesting materials about the social history of bells and
hoped to write a dissertation on this topic. But when I inquired, she had
no answer to a whole series of crucial questions: where the centers of
bronze bell manufacture had been; where the ores were smelted; what their composition was (23% tin and 77% copper was best to avoid cracking);
how bells were transported, hoisted into position, and tuned; and whether
bells became cheaper to make, transport, and install over time (which could
help explain why they became more numerous). Her otherwise excellent training
had not prepared her to think about such matters, and village bells were
just part of the historical landscape. How they got to the bell tower was
not part of the story she had planned to tell.
Until quite recently, a good deal of historical work proceeded on similar assumptions.
For generations, historians wrote about slaves growing rice in the Carolinas
without asking how Englishmen, with no history of growing rice, had reshaped
the swampy coastal land, introduced the right agricultural technologies,
and taught slaves how to plant, care for, and harvest this new crop. Judith
Carney’s seminal Black Rice showed that planters imported rice plants
and slaves accustomed to tending them from what is now Sierra Leone. Tools,
plants, and technical knowledge, like bells, came from somewhere.
Technologies are not marginal to knowing the past. People have woven them into every
aspect of experience, and it can be perilous to ignore them. Yet some historians
limit their definition of technology to steam engines, automobiles, airplanes,
and other large machines that have emerged since industrialization. They
do not realize they are dealing with technology when they write about the
home, the landscape, the city, the workplace, transport, energy systems,
and cultural reproduction, to cite just a few examples. As this list suggests,
however, historians of technology routinely include in their field the
working systems of material culture, from ancient tool making to the microchip.
Few are interested in objects in isolation; most argue that technologies
are socially constructed.
For researchers unfamiliar with this field, the following sketch may be useful.
It is an inherently interdisciplinary field, shading off at its edges into
social history, material culture, museum studies, business history, labor
history, engineering, the history of science, literary history, the arts,
and area studies programs. The Society for the History of Technology (SHOT)
began in the 1950s as a crossroads where scholars from all these areas
met. SHOT members established their credentials with an innovative journal,
Technology and Culture, developed doctoral programs by the 1970s, and grew
into an international community of scholars . . . .
David E. Nye is professor of comparative American studies and history at Warwick
University. His most recent book, Technology Matters: Questions to
Live With (MIT Press, 2006), explores these topics further. In 2005
the Society for the History of Technology awarded him the Leonardo da Vinci
Medal, its highest honor, for outstanding contributions to the field.
Is War Necessary for Economic Growth?
Vernon W. Ruttan
It is worth recalling that knowledge acquired in making weapons played an
important role in the Industrial Revolution. James Watt turned to John
Wilkinson, a cannon-borer who had invented the only machine in England that
could drill through a block of cast iron with accuracy, to bore the condensers
for his steam engines.1 In the
United States, what came to be termed the American system of manufacturing
emerged from the New England armory system of gun manufacture. In 1794
President George Washington, disturbed by the inadequate performance and
corruption of the contract system of gun procurement, proposed a bill,
which the Congress passed, to set up four public armories to manufacture
and supply arms to the U.S. Army. The Springfield Armory became an important
source of wood and metal working machines. Guns with interchangeable parts
were first developed at the Harpers Ferry Armory.2
These are early examples of military exigencies driving technological innovation
and economic growth. Defense and defense-related institutions have played
a predominant role in the development of many of the general-purpose technologies
that shape America today . . . .
Vernon W. Ruttan is Regents Professor
Emeritus in the department of applied economics and in the department of
economics, and adjunct professor in the Hubert H. Humphrey Institute of
Public Affairs, University of Minnesota. He is the author of Is War
Necessary for Economic Growth? Military Procurement and Technology Development
(Oxford University Press, 2006). He has been elected a Fellow of the American
Academy of Arts and Sciences and to membership in the National Academy
of Sciences.
Only Connect: The Rise and Rise (and Fall?) of Atlantic History
Trevor Burnard
Stocks in Atlantic history are high. “We are all Atlanticists now,” declares David
Armitage with blithe disregard for the perils of hubris. The topic has
developed the type of institutional apparatus that signals it is more than
a passing fancy. Courses on “The Atlantic World” abound; positions in Atlantic
history have been advertised at an increasing number of institutions; and
postgraduate programs for Atlantic history specialists are now appearing.
Atlantic historians gather at conferences at exotic locations around the
world; research centers with an Atlantic focus are created every year;
and funding opportunities to do Atlantic history are becoming more frequent.
Perhaps most telling, major universities, research libraries, and scholarly
organizations have begun to treat Atlantic history as a subfield, making
it possible for a cadre of historians to advance their careers, meet lots
of agreeable people who share their own predilections in interesting and
stimulating places, and network through joint participation in seminars
and fellowships.
The Atlantic way allows budding historians a multitude of new research and
job opportunities. This is a remarkable turnabout, considering the dim
prospects facing English-speaking historians of the early modern era in
the late 1970s. For a graduate student in early American history, topics
and areas that had previously been at the cutting edge of scholarship were
now passé. The scholarship of the 1960s and 1970s led away from
a broadening vision. The work of scholars influenced by the Annales
school was extraordinary, ushering in a golden age of scholarship. But
a major failure of social history in all its multitudinous varieties was
a loss of focus. Historians concentrated so intently on the detail of small-scale
communities that, as Bernard Bailyn put it in an extremely influential
1982 jeremiad, previously “discrete and easily controllable” fields of
knowledge had become “boundless” and “incomprehensible,” the “wider boundaries”
unclear. Historians coming into graduate school from the late 1970s into
the mid-1980s were confronted by a bewildering number of studies of small-scale
communities in early America, most of which were individually excellent
but, taken together, generally led to confusion. To adapt the old joke
told about either economists or lawyers, one could lay down side by side
a host of community studies of New England towns and never come to an agreement.
Similarly, 17th-century British history was becoming ever more myopic, introspective,
and irrelevant. Indeed, scholars of the English Civil War undertook the
discipline-destroying act of claiming that the object of their study—the
English Revolution—did not really exist. It seemed as if there was nothing
interesting left to be said about either early modern Britain or colonial
America. Branching out into Atlantic history or the related New British
History was a way of escaping intellectual stupefaction. It also gave aspiring
academics an entrée into a still fiercely contested job market.
In part, Atlantic history has developed out of the relentless need for
scholarship to be about new and unexplored fields. In part, also, it has
been an understandable response by historians—as attuned to market possibilities
as any other group of professionals—to the changing market of academic
scholarship and employment.
What I have said thus far may strike readers as unduly cynical in its emphasis
on the career-enhancing potentialities of Atlantic history. But the institutional
apparatus that has accompanied the advent of Atlantic history as one of
the more important historiographical developments of recent times did not
develop just because it met the needs of a generation of historians anxious
to be established in a dynamic new area. Atlantic history has real intellectual
clout. It has reinvigorated the histories of early America and Latin America.
Its principal theme—that the Atlantic from the 15th century to the present
was not just a physical fact but a particular zone of exchange and interchange,
circulation, and transmission—is a conceptual leap forward. True believers
in the approach argue that Atlantic history, with its emphasis on movement,
fluidity, and connections between nations, peoples, and events, shows how
the modern world was made. The idea of Atlantic history as a field of historical
inquiry that is “additive,” or more than the sum of an aggregation of several
national or regional histories, pushes historians toward both methodological
pluralism and expanded horizons. In short, my comments on the career-enhancing
possibilities of Atlantic history may be cynical, but they are not the
result of skepticism about the utility of Atlantic history as a method
or as a subject of inquiry . . . .
Trevor Burnard is professor of
American history at the University of Sussex. His most recent book is Mastery,
Tyranny, and Desire: Thomas Thistlewood and His Slaves in the Anglo-Jamaican
World (University of North Carolina Press and The Press University of
the West Indies, 2004).
“Victors Are Not Judged”: Byways on Stalin's Road to Berlin
Evan Mawdsley
"Victors are not judged” was one
of Stalin’s favorite sayings. He used these words most memorably in a post-war
speech justifying his leadership in the “Great Patriotic War.” In the good
old days of the USSR there developed a similar broad brush approach to
the history of this war, based as much on Russian nationalism as on Marxism-Leninism.
Over time this perspective has been influential outside Russia as well.
The release in Moscow of new material, however, has allowed historians
to produce more unvarnished accounts of the strategic direction of the
Soviet war effort. Especially useful have been the documents to and from
Stalin and his Stavka (General Headquarters), giving a better sense
of what was intended in particular situations.1 In
addition, the availability of Stalin’s appointments diary means that these
orders can be tied to the individuals that the Soviet leader consulted.
Also valuable has been the publication of new memoirs and diaries, and
less censored (if not uncensored) versions of earlier ones.2
Much more information about Soviet casualties has also been made public,
which allows an assessment—albeit indirect—of the scale and importance
of different operations . . . .
Evan Mawdsley is professor of
international history at the University of Glasgow. His most recent book
is Thunder in the East: The Nazi-Soviet War, 1941-1945 (Hodder Arnold,
2005).
After the Deluge: Russian Ark and the Abuses of History*
Pamela Kachurin and Ernest A. Zitser
The critical and commercial success of Russian Ark (2002), Aleksandr
Sokurov’s most recent effort in historical docudrama, necessitates a thoughtful
response from anyone seriously interested in Russian history, and most
especially from American Slavic studies professionals. After all,
any movie that features cameos by Peter the Great, Catherine the Great,
Pushkin, Nicholas II, and Alexandra, and that enlists the Russophobic Marquis
de Custine as the official tour guide to 300 years of history and more
than thirty rooms of the State Hermitage Museum in St. Petersburg, simply
cries out for a discussion of its historical and ideological implications.
Our goal in this self-avowedly polemical piece is to open up a critical
discussion about Sokurov’s film, which we see as a significant milestone
in the ongoing attempt to define Russian national identity vis-à-vis
the West.
Russian Ark is undoubtedly a technological tour de force—a single, uninterrupted
90-minute shot, featuring paintings and lavish period costumes from the
Hermitage Museum, mellifluous music by the Mariinskii Theater Orchestra,
and a cast of nearly a thousand extras. This stunning visual spectacle
can easily blind audiences to the film’s troubling political and ideological
messages. The gist of these messages is contained in the film’s title,
which suggests that after the deluge of the 20th century, it is a Russian ark (the Hermitage) that remains afloat on the waters of time to carry on the
mission of restoring Culture to a world chastised by the wrath of God.
Although this reading of the film may seem a bit farfetched, it is an interpretation
that echoes the bombast of Sokurov’s open letter to American audiences,
which appeared on the Landmark Theatres’ Web site soon after the release
of the film in the U.S.1 Sokurov
told Americans that the time has again come for people to build arks and that there must be no
delay, and that the Russians have already built their Ark, but not just
for themselves—they will take all with them, they will save all, because
neither Rembrandt, nor El Greco, nor Stasov, nor Raphael, nor Guarenghi
nor Rastrelli will allow an ark such as this to disappear or people to
die. Those that will be together with them . . . will definitely
go to heaven.
But as the tone of the letter makes clear, Sokurov is certain that his voice
will fall on deaf ears. For according to Sokurov, it is neither Americans’
ignorance of Russian history nor their supposed youthful “wish” to lead
“world civilization” that prevents them from seeing the point of the movie.
It is, rather, “their own hardheartedness.” Sokurov believes that American
audiences are simply unredeemable cultural philistines. And Russian Ark
is his rod of chastisement.
Sokurov’s controversial political agenda is built into the very structure of the
film. His use of the single, continuous shot—arguably the main reason
why American movie critics urged audiences to go see Russian Ark—is
a technological achievement that is inseparable from the movie’s ideological
content. It is Sokurov’s attempt to realize Andrei Tarkovskii’s (his teacher’s)
vision of the inherent equation between “real time” and “reel time.”2
Although Sokurov’s movie flits through three centuries in an hour and a
half of reel time, it aims to tap into and to illuminate the deeper, real
history of modern Russia. The documentary quality of the hand-held camera
presents a narrative of Russian history in which the tsars are doomed by
forces beyond their control. So, for example, the breathtaking scene in
which the ball-goers descend the Jordan Staircase of the Winter Palace
into oblivion—lined up row by row, as in some kind of Russian historical
iconostasis—masterfully evokes the tragic fate of the gloriously dressed
and doomed passengers of the Titanic. However, the sheer cinematic
beauty of this penultimate scene should not blind us to the fact that the
passengers aboard this ill-fated vessel actually had a hand in helping
the Russian ship of state go down into the maelstrom of the 20th century.
Treating them as pitiful victims, or what is worse, as martyrs in a nostalgic
and sentimentalized vision that exists only in the partisan interpretations
of nationalist mythmakers, deprives the real historical figures depicted
in the movie of the agency that they most surely possessed.
Sokurov’s single shot manufactures continuities in a narrative that pointedly excludes
(almost) the entire Soviet period and harkens back to a nostalgic vision
of Nicholas II, the last Romanov tsar, as the saintly forgiving father—not
as the “Bloody Nicholas” whose troops fired upon unarmed demonstrators
calling for an end to an unjust war, economic exploitation, and bureaucratic
arbitrariness. In effect, the movie’s trick photography offers a powerful
justification of the controversial canonization of the “martyred” Nicholas
II, who was granted the title of “passion bearer” (the lowest rung in the
pantheon of Eastern Orthodox saints) in August 2000. And sainthood, like
visionary filmmaking, brooks no arguments.
The ideological premise of Russian Ark is based on an unspoken, but
nevertheless quite clear juxtaposition between Western “civilization” and
Russian “culture.” This juxtaposition is played out in the running verbal
battles between the movie’s unseen narrator (Sokurov) and his guide, Marquis
Astolphe de Custine, a French Catholic aristocrat, whose international
bestseller Russia in 1839 sparked a continuing debate about Russia’s
Sonderweg. The filmmaker’s choice of museum guide reflects the movie’s
overarching concern with the issue of Russian national identity, particularly
as expressed in the evolving and frequently troubled relationship between
Russia and the West: Is Russia a part of Europe or Asia? Do the Russians
have a “national character”? And if so, can an analysis of this character
explain what has frequently been described as Russia’s imperialist policies,
its authoritarian political system, and its servile reliance on foreign
models? Following in the footsteps of such early 19th-century Catholic
thinkers as Joseph de Maistre, Alexis de Tocqueville, and Petr Chaadaev
(whose seminal “Philosophical Letters” he read in their original, French
version), Custine did not hesitate to place Russia on the other side of
the great cultural divide between European civilization and Asian barbarism.
Custine’s political exposé was rediscovered during the height of the Cold
War and hailed as a prophetic work by none other than George F. Kennan,
the American diplomat responsible for formulating the policy of “containment”
of the Soviet Union.3 What is
less well known, and much more surprising, is the hold that Custine’s unflattering
description of Russia has had (and continues to have) on Russians themselves.
Indeed, Russian Ark may be seen as Sokurov’s attempt to exorcize
Custine’s ghost from the Russian national consciousness.
The character of the effete French marquis embodies a surface brilliance that
serves to demonstrate the self-satisfied banality and decadence of a civilization
in decline. Custine’s sense of moral, aesthetic, and political superiority
is challenged at every step of his seemingly pointless stroll through the
Hermitage, the “Russian Ark” that saved the Great Masters of world art
during the deluge of the 20th century. The scene in which the Marquis is
confronted by an angry coffin maker in a war-ravaged Hermitage—an oblique
reference to the 900-day siege of Leningrad and to the heroism of those
museum workers who succeeded in saving the art from being looted by the
Nazis—is much more than a disturbing aside in a lavish costume drama. It
is, rather, a visual illustration of the nationalist trope that posits
a distinction between a nation willing to make the ultimate sacrifice in
the name of cultural treasures and universal values (dukhovnye tsennosti)
and a “civilized” country that simply lets the “Huns” take over its glittering
world capital without a struggle. As such, this scene relies on the well-worn
tactic of using the suffering inflicted upon the Russian people as a badge
of their cultural and spiritual superiority. Harking back to the arguments
of such eminent 19th-century Kulturträgers as Dostoyevsky,
Sokurov presents Russia as the Christ of modern nations and the only hope
of salvation for a materialist, bourgeois, and decadent West.
The question of Russia’s perennial indebtedness to Western models—the main
theme of Custine’s running commentary and the leitmotiv of the film—is
particularly troublesome to a New Russian patriot such as Sokurov. What
has Russia really contributed to Western, let alone world civilization?
What makes a work of art distinctively Russian if the artist continues
to rely on Western representational norms and techniques? The movie as
a whole offers the possibility that Russia’s primary contribution to the
world of visual arts is in fact the Hermitage itself, the “Ark” that contains
the highest points of Western achievement in fine arts. And, as the chronicler
of the Ark, Sokurov partakes of that glory. Indeed, Russian Ark
may be seen as his assertion that a contemporary Russian artist can rely
on Western models (for example, such single-shot films as Alfred Hitchcock’s Rope)
and technology (high-definition video camera and the foreign expert to
operate it) and still produce a work of striking originality. And because
Sokurov is still trapped within the confines of a Romantic aesthetic, according
to which nationalism and originality find their most sublime expression
in the figure of the national genius—whether a Pushkin or a Goethe—he rather
immodestly positions himself (and his work) as the embodiment of the genius
of contemporary Russia.
Sokurov focuses our attention on three masterpieces of world painting, two by Rembrandt
and one by the Greek-born Spanish Baroque artist, El Greco. The latter’s
Peter and Paul offers a fitting backdrop for a heated discussion about greatness,
religion, and Russia’s fate in the grand scheme of things. In a scene that
has puzzled many viewers, we see the Marquis scolding a young man for treating
this intense religious painting, most likely intended as a devotional object
to inspire meditation, purely as an aesthetic object. The vehemence with
which Custine browbeats the youth suggests that there is more to the scene
than a dysfunctional lesson in art appreciation. Reading between the lines
of the exchange between Custine and the youth, the modern moviegoer is
prodded to pay attention to the religious significance of the subject matter
of this painting. For these are not just any saints, but the patron saints
of both Peter the Great and his city. Sokurov suggests that Peter was not
just another monarch, but a ruler whose creation fits into the largest
scheme possible—the story of the world’s redemption. To see Peter merely
as the despot at the beginning of the movie is to miss the redemptive significance
of what he himself described as his “paradise”—the city of St. Peter. The
monarch’s presence weighs heavily on the whole movie, even though he appears
for only a couple of minutes at the beginning of
Russian Ark. Peter
the Great is the “rock” on which the city, the building of the Hermitage,
and also the film are based, but a rock that weighs heavily around the neck of any Russian nationalist. The prominence of the Petrine theme helps to
explain why Russian Ark had its official Russian premiere in May
2003, at the tercentenary of St. Petersburg, which celebrated what President
Putin described as “the glory of Russia, and the provenance of that glory.”
The
choice of the two works by Rembrandt—the Prodigal Son and Danaë—as
nonpareils of Western achievement picks up on both the religious and the
Petrine themes introduced with El Greco’s Peter and Paul. On the
one hand, these paintings recall Peter’s fascination with things Dutch
as a model of European cultural achievement. One only needs to remember
Peter’s work on the docks of Zaandam and his intention to build a new European
commercial port in order to comprehend the immense symbolic significance
of Dutch culture to the building of Peter’s “paradise.” It is only appropriate,
therefore, that Russia’s “Ark” (like all public buildings in the Russian
Federation) should fly a tricolor flag modeled on that of Holland and that
the modern Hermitage—that paragon of the marriage between culture and commerce—should
own and exhibit several paintings by the Dutch master. As with El
Greco’s Peter and Paul, however, the Petrine theme cannot be understood
apart from the religious issues raised by Sokurov’s vision of Russian history.
The camera’s long and languorous shots of the Prodigal Son, a painting
that embodies the idea of redemption through repentance, suggest that
the contemporary religious revival fostered by the Russian political elite
signals a more fundamental and wide-ranging return to the Lord. Anyone
who has witnessed the spectacle of President Putin’s inauguration—which
included a controversial appearance by Patriarch Alexy II, in flagrant
(if as yet only ceremonial) violation of the constitutional separation
of church and state—will see that Russia is no longer the land of the godless
communists.
In
this context, even the painting of Danaë—the story of a virgin
(Danaë) impregnated by a disembodied deity (Zeus, disguised in the
form of a golden shower), only to give birth to a heroic dragon slayer
(Perseus)—takes on a politicized meaning. Although this subject seems
far removed from assertions of Russian nationalism and Russian Orthodoxy,
the implicit references to the Virgin Mary and to St. George (and thereby,
respectively, to Christ the Redeemer and Russian statehood) suggest that
this painting was purposefully chosen to support Sokurov’s overall agenda.
Danaë
stands out for another reason: this work was attacked in 1985 by a deranged
visitor, who slashed the painting across the middle and then doused it
with acid. Conservators in the Hermitage worked for twelve years to restore
the painting, and it was placed back on exhibit only in 1997. The Herculean
efforts to restore this painting were documented in an exhibition entitled
“Danaë: The Fate of Rembrandt’s Masterpiece” and have become
part of its allure and lore. The inclusion of Danaë
seems to suggest that while the painting was a Dutch achievement, its restoration/preservation
can most certainly be touted as a Russian one. This, for Sokurov, is precisely
the kind of thankless burden that would be undertaken by the people chosen
by God Himself to preserve Western civilization.
As
Sokurov’s patronizing letter makes perfectly clear, the movie’s historical
mystifications, while aesthetically stunning and technologically innovative,
were never intended to promote education about Russia.
In that sense, the mutually contradictory opinions of Custine and Sokurov
are actually two sides of the same coin. Indeed, for all of its technological
razzle-dazzle, Russian Ark contributes as much to the ignorance
about Russia as the “Potemkin villages” that Custine encountered at the
court of Nicholas I. By representing Russia as a land that, in the words
of the 19th-century poet Fedor Tiutchev, “cannot be grasped intellectually
or measured by a common standard . . . but must simply be believed in,”
Russian
Ark only further propagates the cycle of mythmaking and fear that has
characterized relations between Russia and the West. At the end of this
cinematic spectacle, we are still left with the same old dichotomies between
East and West, modern and backward, civilized and barbarous. Far from offering
salvation to the world, Russian Ark turns out to carry the same
load of Russian nationalism that helped to unleash the flood of the 20th
century.
*A longer version of this essay originally
appeared in NewsNet: News of the American Association for the Advancement
of Slavic Studies 43, no. 4 (2003), 17-22.
Ernest A. Zitser <zitser@fas.harvard.edu>
is a research associate at the Davis Center for Russian and Eurasian Studies.
He is the author of The Transfigured Kingdom: Sacred Parody and Charismatic
Authority at the Court of Peter the Great (Cornell University Press,
2004).
Pamela Kachurin <kachurin@fas.harvard.edu>
is a research associate at the Davis Center for Russian and Eurasian Studies
and co-founder of the Society of Historians of East European and Russian
Art (SHERA). She is the author of numerous publications on Russian
and Soviet art.
1 For
the text of Sokurov’s open letter to the American public, see “Sailing
Russian
Ark to the New World” <www.landmarktheatres.com/Stories/ark_frame.html>.
2 Jane
Knox-Voina (Bowdoin College) explained Sokurov’s indebtedness to Tarkovskii
at a roundtable discussion of Russian Ark sponsored by the Davis Center
for Russian and Eurasian Studies, Harvard University, May 2, 2003.
3 George
F. Kennan, The Marquis de Custine and His Russia in 1839 (Princeton
University Press, 1971).
Historically
Speaking: The Bulletin of the Historical Society
July/August
2006
Volume VII, Number 6
Conjectures
of Order: A Review Essay*
Bertram
Wyatt-Brown
Michael
O’Brien’s Conjectures of Order stands in bold contrast to a longstanding
denigration of Southernness, intellectual and otherwise. After the Civil
War, members of the northern intelligentsia found their southern brethren
hopelessly backward, parochial, and dull-witted. In The Education of
Henry Adams the autobiographer observed that antebellum Southerners
had been “stupendously ignorant of the world.” Adams characterized the
lords of cotton as “mentally one-sided, ill-balanced, and provincial to
a degree rarely known.” In fact, the Southerner, Adams stated unreservedly,
“had no mind; he had temperament.”1 Likewise,
Henry James blasted the South’s pursuit of a false “Confederate dream”
that “meant the eternal bowdlerization of books and journals” and placed
“all literature and all art on an expurgatory index.”2
Referring
to the South of his day, H. L. Mencken in 1917 opened an influential essay
by quoting the poet J. Gordon Coogler: “Alas, for the South! Her
books have grown fewer–/She never was much given to literature.” Mencken
added, “It is, indeed, amazing to contemplate so vast a vacuity . . . .
Nearly the whole of Europe could be lost in that stupendous region of worn-out
farms, shoddy cities, and paralyzed cerebrums.”3
As
late as the 1930s, Southerners themselves deplored the seeming absence
of cultural giants despite the appearance of William Faulkner and many
others. The North Carolina journalist W. J. Cash traced the problem to
the ways of the Old South. A “savage ideal” of white superiority and brutish
brawn over intellect helped to defend slavery.4 The
late C. Vann Woodward, who became the leading historian of the South, recalled
that he was not alone in “parroting metropolitan wisdom” that the “critical
moguls” dispensed from “the Hudson.” To the young Woodward, even Faulkner
appeared to draw “his subjects out of abandoned wells.”5
The
poet Allen Tate censured antebellum Southerners who “knew no history for
the sake of knowing it,” leaving an inadequate legacy for successors to
build upon.6 Still more recently,
the Southern novelist Elizabeth Spencer wrote that during the 19th century
“the South was a dormant land.” It was “‘backward’—poorly schooled, poorly
fed, a crippled land.”7
In
Conjectures
of Order O’Brien defies all these critics. According to O’Brien, the
South, far from being a listless backwater, contributed to national culture
far more than has ever been recognized. To overturn those decades of neglect
and even mockery, the English don at Jesus College, Cambridge University,
establishes an impressive case. Broader in scope than all previous works,
his imaginative, elegant, and original text inspires the scholar’s awe.
How did he manage to read all that long-forgotten material?
Conjectures
covers literature, theology, history, political theory, science, and
ethnography. It is on the basis of their erudition and breadth that the
two volumes have won the Bancroft Prize, the Merle Curti Award, and the
Frank Owsley Prize, along with being short-listed for the Pulitzer.
Despite
its many strengths, O’Brien’s approach leaves some significant issues unexamined.
If the South were highly endowed with intellectual rigor and forcefulness,
why was its record of achievement so long unrecognized? The opinions of
Adams, James, and the rest cannot be casually dismissed. It is no longer
fashionable to set up criteria of literary supremacy in the hierarchical
manner of F. R. Leavis's The Great Tradition. Yet O’Brien makes
no case for the enduring merits of these antebellum writers. Herman Melville,
Nathaniel Hawthorne, and the others of the New England antebellum Renaissance
left a legacy from which later American authors drew inspiration. The South’s
conservative intelligentsia so greatly feared the future that few of its members
provided such a foundation for their successors. Slaveholding had much to do with
that failure, as Clement Eaton and others pointed out long ago. . . .
*Michael O’Brien, Conjectures
of Order: Intellectual Life and the American South, 1810-1860, 2 vols.
(University of North Carolina Press, 2004).
Bertram Wyatt-Brown is Richard
J. Milbauer Emeritus Professor of History, University of Florida, and Visiting
Scholar, Johns Hopkins University. He has served as president of
the Society for Historians of the Early American Republic (1994), the St.
George Tucker Society (1998-99), and the Southern Historical Association
(2000-01). His most recent book is Hearts of Darkness: Wellsprings
of a Southern Literary Tradition (Louisiana State University Press,
2003).
Historically
Speaking: The Bulletin of the Historical Society
July/August
2006
Volume VII, Number 6
Historians
in the Midst of Catastrophe: Reflections of the University of New Orleans’s
Department
of History after Hurricane Katrina
The
fall semester was only two weeks old at the University of New Orleans when
Katrina bore down on the Gulf Coast. As the UNO history faculty evacuated,
most took a few books and lecture notes along, believing they would be
back in the classroom in a few days. In fact, it would be weeks before
they would be allowed into the devastated city and months before they would
be able to access their offices. Scattered across the country in hotels
and shelters, or housed with family, friends, or strangers, faculty members
were torn from their colleagues and the university. The technology of the
modern age—cell phones, servers, e-mail addresses—had collapsed along with
their campus. Research materials were endangered. Homes were destroyed
or inaccessible. The campus became a temporary shelter for perhaps 2,000
storm victims and sustained over $100 million in damage.
Yet
only six weeks after Katrina, the University of New Orleans reopened for
its fall semester, the first and only university in New Orleans to do so.
Over 7,000 UNO students attended lecture courses at satellite campuses
in New Orleans suburbs or participated in online courses from locations
across America and overseas. In December and January, many faculty members
taught intensive intersession courses. On January 20, 2006, the University
of New Orleans, against all odds, held its fall graduation.
The
much reduced history faculty returned to the reopened but still damaged
main campus for the spring semester. Two senior members who had planned
to retire in May 2006 opted to leave in December. A job search was suspended.
FEMA trailers intended to house homeless faculty and staff did not become
available until April, and UNO is currently in a state of financial exigency
that will mean termination of faculty and programs throughout the university.
Although the campus remained dry for the most part, it is surrounded by
some of New Orleans’s most devastated neighborhoods. Huge cranes and pile
drivers work unceasingly along the London Avenue canal that runs alongside
campus. There’s barely a functioning business for miles, and reminders
of the devastation of the storm and flood are everywhere. On a positive
note, in January distinguished military historian Allan Millett, recently
retired from Ohio State University, joined the faculty as Director of the
Eisenhower Center, founded by our late colleague, Stephen Ambrose. It was
Millett who first suggested that the history department should tell its story.
Below
are thoughts and reflections of members of the University of New Orleans
history department.
Ida
Altman, research professor and department chair, fall 2005, had accepted
a position at the University of Florida in spring 2005 but stayed to serve
her year as chair. After losing her home to Katrina, she drove over 300
miles twice weekly to teach her classes.
Recalling
everything that has happened since the storm and flood is like summoning
up a dream: some episodes stand out clearly while others are barely retrievable.
Early on I had a conversation with a cousin who suggested (insistently,
it seemed) that having lost my home and all belongings, I now “knew what
was really important.” It irritated me at the time—I didn’t know exactly
how I felt but certainly didn’t want someone else telling me how I should.
In retrospect it seems even more off the mark. Of course I was thankful
my husband and I had escaped harm, as had our friends; of course I was
grateful for the love and support of family. But in fact it’s all important—chatting
with my neighbors, early morning walks with my dog, seeing friends and
colleagues at the gym, dinner at a favorite restaurant, zydeco music at
Mid City Lanes; they all made life rich and familiar. Losses are
not confined to what one can list on insurance claims.
In
a larger sense, and especially in historical terms, it all does count.
Since the storm I’ve lived in Mobile, which suffered relatively little
damage; my husband soon resumed teaching at the University of South Alabama.
Bereft and dazed, I offered to lecture to his class on the conquest of
western Mexico. The lecture had peculiar resonance; the pivotal episode
of that conquest was an immense storm and flood that engulfed the sprawling
encampment of Spaniards and their Indian troops and auxiliaries along the
banks of a river in September 1530.
The
leader of the campaign, Nuño de Guzmán, was perhaps the most
notorious of the Spanish conquerors of Mexico. Shrewd, ambitious, and callous,
for a decade he exercised considerable power in New Spain. Yet, in the
greatest
test of his leadership abilities, following the flood he failed utterly,
making decisions that compounded the misery and mortality suffered by his
native allies. Just as some friends and I one evening speculated almost
tearfully about how things might have unfolded had Edwin Edwards been governor
at the time of Katrina, I wonder what the adroit Cortés might have
done in the same circumstances. Yet Guzmán’s failure of leadership
was only one of several factors that resulted in catastrophe.
When
UNO resumed classes in October I taught four graduate students in my introductory
course. We agreed that the history of Katrina and its aftermath must take
into account politics and political leadership, or lack of such; weather,
hydrology, geography; engineering; demographics; and the distinctive society
and culture that shaped and reflected New Orleans’s neighborhoods and people.
Over the years I’ve sometimes questioned the social value of what we do
as historians. More than ever I feel that my students want to understand
the “why” of history—not just of what happened to them but what has occurred
in other times and places as well. Our job is to help them make sense of
the past’s sad mysteries.
Günter
Bischof, diplomatic historian, department chair 2006 and director of Center
Austria, first came to UNO from Austria as an exchange student. After receiving
his Ph.D. from Harvard, he returned to make Louisiana his home.
I
evacuated for the first time from an approaching hurricane. Because I am married
to a fearless Cajun who considers herself a seasoned storm-survivalist,
we had never evacuated before. This time our frightened 14-year-old daughter
quasi-forced us to leave. We went to Hot Springs, Arkansas (the first place
we could find a hotel room), and took to the waters of this historic spa
while the storm raged on the Gulf Coast. With hindsight it strikes me as
bizarre irony that I was sitting in a pool of hot water in Hot Springs
at the time when the canal walls broke and began filling the bowl that
is New Orleans. We returned to our house in Larose on Bayou Lafourche (fifty
miles southwest of New Orleans) the day after the storm. Our house was
okay. We had lots of wind damage in the yard, but no water damage. We were
without electricity for almost a week. The only source of news was the
radio.
In
the days after the storm I gave many interviews to Austrian newspapers
and started to get in touch with people, including my chair, Dr. Altman.
I also helped get some forty Austrian students, who had begun a year of
studies at UNO the week before Katrina hit, into new host institutions
all around the country (only four returned to their native Austria after
the storm). American universities from San Diego State to Miami were magnificently
generous in “adopting” these foreign students for a semester with tuition
waivers. All of them are completing the year at their new institutions,
as the housing shortage in New Orleans did not allow them to return to
UNO. American civil society rose to the occasion, while the government
in Washington dawdled. At the same time our partner university in Innsbruck,
Austria, and other European universities granted refuge, accepting
UNO and New Orleans area students free of charge for a semester of study.
Mutual transatlantic solidarity and higher education networking shone bright
at a time of political tensions with the war in Iraq.
Since
I still had a roof over my head and was back home, I began commuting and
teaching two classes at Louisiana State University in Baton Rouge ten days
after Katrina hit. My U.S. History Survey II was a “refugee” section with
student evacuees from all the New Orleans institutions (UNO, Tulane, Loyola,
Dillard, Xavier, Delgado, and Holy Cross). Students missed classes as they
tried to deal with lost property, insurance adjusters, and FEMA. Perceptibly
dejected, they nonetheless were intent on continuing their education in
hopes of restoring some structure to their shattered lives.
Teaching
and writing did the same for me. Applying myself to what I like to do best
provided relief and prevented depression, especially once I saw storm-ravaged
New Orleans in early October. I sent regular reports to Austrian newspapers
and penned ruminations on what I saw in the city for friends around the
country and abroad—some of which were published on HNN (kudos to Dr. Rick
Shenkman!). Writing was a form of therapy. On October 10, on my commute
to Baton Rouge, I heard a report on NPR that UNO had reopened at a satellite
campus in Jefferson Parish. It made me cry. Not only would I be able to
hang on to my tenured faculty position (for the time being?), but I would
also be involved in rebuilding our university, which has served many poor
and underprivileged students from the greater New Orleans area for two
generations. Continuing their education would provide them a means of returning
and contributing to the Crescent City.
When
I entered the UNO main campus for the first time in early October I was
not sure whether my books and lecture notes had survived the flooding on
the south side of the lakefront campus. I was lucky again—they were still
there. The office smelled a little musty but otherwise looked untouched.
While some colleagues sadly had lost their houses and their books, my personal
and professional life was still intact. Go figure. Now I inch toward a better
comprehension of why survivors of major catastrophes wrestle with a bad
conscience.
Assistant
professor James Mokhiber teaches African and world history, and was the
first to return to the city after the flood.
Behind
the wheel of a spattered minivan, I struggled to make my way to campus
in mid-September. My usual route followed the bayou up to the lake, through
a parade of the city’s famous live oaks. I’d turn at the Greek church,
pass the brick homes of the university district, and cross a small bridge
over an unmarked canal. I suppose it could have been a pleasant ten-minute
commute, but as an assistant professor with new courses to teach I usually
had to make it in seven or eight.
It
was slower going in the van after the storm. As I wound my way through
the downed trees and power lines, I found myself dwelling on the scenes
I had seen and imagined during our two-week evacuation. In my mind I saw
the department’s halls striped with the breached canal’s muck; inside my
office, I could imagine my bookshelves floating for a moment, before tipping
my stashed crates of research into the floodwaters.
These
images did not really begin to fade until I pulled up beside our building.
While the low brick buildings across the street were ruined, the waters
had just lapped at the glass doors of our breezeway. The department’s halls
were maybe a little dank. I opened the door to my office and shined my
flashlight on my stowed research. I still may have an academic career,
I thought selfishly. A friend helped fill the back of the filthy van with
my folders. Let’s hurry, he quipped, we don’t want to be the first people
in the city to be shot for looting academic research. That afternoon, I
shared the campus news with my colleagues in the emerging diaspora. The
message went out amid reports of missing colleagues and more. Already my
elation had faded. My priceless folders ended up in the corner of a garage.
Over
the next several weeks I traveled the city, shelters, and outlying parishes
with a British television crew, and grew more dismayed about the extent
of the damage. At the same time, the university administration mounted
ambitious plans to restart the semester online, and I found myself almost
openly rebellious. The idea of teaching a course on African history, in
the midst of everything, seemed preposterous. Later experience has shown
that, from a practical point of view, this was a wiser course of action
than I had let myself understand at the time. As our enrollment numbers
decline, and the first waves of firings and retirements begin, it is clear
that we—and the city—have not yet heard the last of Katrina’s effects.
I think
about this a lot when I drive home after class now. From the small bridge
over the canal, I can see the workers reinforcing the broken levees.
I watch the debris piles come and go outside the empty homes, and wonder
how many people will attend the summer festival at the quiet Greek church.
I slow down when I turn onto the shady tree-lined boulevard by the bayou.
I try to forget about the arborist I heard recently on the radio, saying
that many of these leafy great oaks are in peril, and may even be dead.
They just don’t know it yet, he said.
Andrew
Goss, assistant professor of history, is the department specialist in the
history of Asia and Indonesia and has been with the department for two
years.
Five
weeks after we evacuated, my wife and I returned to our house in uptown
New Orleans. We found it largely as we left it, with minor
wind damage and no flooding. After pulling down the hurricane boards, we
opened the windows and peeked into the refrigerator. It had been emptied
of all the meat prior to the evacuation,
and we were able to salvage it with bleach and baking soda. We had been
very fortunate and were full of hope. As members of the first round of
regular residents back in the city, we felt like pioneers in a frontier
town, capable of exerting our will on its future. After a month of inactivity
in exile, glued to the TV and the Internet, we were euphoric at the prospects
for action. With a safe haven to return to every night, the city looked
more like opportunity than ruin.
The
University of New Orleans reopened a week later, and in addition to reviving
the first part of the Asian history survey I had started five weeks earlier,
I began teaching a new history of science survey course online. This might
have been a daunting task, with no books or lecture notes at home, and
without electricity or a phone line. But in fact the experience freed me
from the usual pressures facing a new faculty member. Professionally I
was on my own, without anyone watching over my shoulder. In those early
weeks there were still few rules, and we all had a collegial sense of
shared mission. I gained access to my office before the campus was closed
down for mold remediation, and the manager of the streetcar barn across
the street offered us an energized extension cord during the first critical
week of classes. The rewards were immediate. My students, both online and
at the satellite location, were grateful for the return to class, and were
eager to turn their minds towards Aristotle and the Ramayana. All four
of my Asian history students had perfect attendance, arriving twice a week—at
7:45 am!—for nine weeks straight. It was the kind of course I had always
dreamt of teaching.
Reflecting
back now on the past six months, I see how Katrina led to the maturation
of my historical consciousness. Questions about the political and social
causes of the man-made part of the disaster led me to shed the narrow
intellectual worldview I had inhabited in graduate school for an outlook
in which historians answer questions important to their community. Even
before returning home, I had begun working with my colleague Connie Atkinson
on a grant proposal examining the historical parallels between floodworks
technology in the Netherlands and Louisiana. Even with no access to research
materials, it was some of the most natural writing I had ever done. It
was so obvious that in the effort to rebuild south Louisiana, historians
would play a critical role, and that I, too, could do my part.
Much
of the euphoria has worn off today. Classes have resumed on campus, and
so have the usual challenges of grading, worrying about publications, and
dealing with plagiarism. And the reality of
considerable cutbacks at the university, caused by lower student enrollments
and diminished state funding, has finally set in. But for me, Katrina has
opened up my work, granting a kind of freedom to pursue less narrowly defined
intellectual agendas.
Joe
Louis Caldwell, associate professor of history and former department chair,
is a native of north Louisiana and specialist in post-Civil War U.S. history.
On Saturday night, August 27, and
again on Sunday morning, August 28, 2005, I listened to Mayor Ray Nagin
and other city and state officials as they pontificated about the likely
path of Hurricane Katrina. By early Sunday morning my wife, my daughter,
and I had decided to evacuate our homes and our uptown neighborhood. As
I left, I was stopped by a neighbor who asked me to try to persuade her
husband to leave New Orleans. I did so and was rebuffed. The 85-year-old
Reverend Montgomery informed me that he believed emphatically that the
Lord would take care of him. At that point, I excused myself; he promised
to pray for me, and I promised to do the same for him. My family and I
left for Opelousas, Louisiana, the home of one of my wife’s brothers. In
the evacuation traffic, it took us nine hours to make the normally two-hour
trip. After staying with my brother-in-law for a month, we rented a trailer
and remained in Opelousas for another five months. I came back to New Orleans
briefly in September, when persons living in our zip code were allowed
to come into the city to examine their homes. We returned permanently
in February 2006. Now we live in a FEMA trailer situated in the driveway
of our damaged home.
I’m haunted by the images of that
September visit. I arrived in the early morning hours. There were no streetlights.
A misty pall hung over the city. As I drove down Claiborne Avenue, I saw
a black Cadillac hearse upended on the neutral ground, leaning against
a palm tree, reminiscent of something from a movie set. Arriving at my
house I found the usually lush green neighborhood a dusty brown; there
were no dogs, cats, pigeons, or “hoot” owls. The silence was deafening.
The dispersal of my uptown neighborhood
reflects the larger picture of storm-ravaged New Orleans. My recalcitrant
elderly neighbor, mentioned earlier, who chose to remain and ride out the
storm, did survive, but water rose over ten feet inside his two-story home.
He retreated to the second floor where he was forced to spend a terrifying
night sleeping atop a dresser. Rev. Montgomery, a veteran of World War
II, had finally left his home when the National Guard came to his door—he
said he could not refuse men in the uniform of the United States Army with
whom he had served in the Pacific Theater. His stepson, Byron J. Stewart,
who lives around the corner, evacuated his family to Alexandria, Louisiana.
Joseph Victor, a retired truck driver, who lived a block away, evacuated
with his family to Baton Rouge, Louisiana. Chalmous Smith, another neighbor,
evacuated to Jackson, Mississippi. Charles Wilson, a neighbor employed
at a local hotel, fled to the Convention Center and after two days there
made his way to the Superdome and from there was evacuated to San Antonio,
Texas. Most people in my neighborhood with the wherewithal to leave did
so. Like our neighbors, when my family and I left before the storm, we
packed for a weekend away from home. It was a very long weekend.
Arnold R. Hirsch, research professor,
Ethel and Herman Midlo Chair in New Orleans Studies, and director of the
Midlo Center for New Orleans Studies, is president-elect of The Urban History
Association.
We were reluctant to leave our home,
which was situated on a piece of (relatively) high ground in the Carrollton-University
area uptown. We had listened to the dire warnings that attended the coming
of Hurricane Ivan the year before, and evacuated for the first time during
the quarter century we had lived in New Orleans. It was an absolute fiasco.
Poorly planned and executed, our trip, undertaken with all the help and
assistance the state could muster, got us to Lafayette in twelve hours—an
excursion that normally took two or three. Saved by a late change in the
hurricane’s course, we returned home, vowing never to be stampeded out
of our home again. My wife proved particularly adamant on that point, as
a chronic illness and prescription drugs precluded my providing any relief
for her as a driver.
Our bravado disappeared the next
year with Katrina’s approach. We packed quickly and lightly—we were sure
we would be home in a few days—and left our larger suitcases out for the
next weekend’s planned flight to Chicago where my nephew would be getting
married. Aside from some T-shirts and jeans, I grabbed only my laptop on
the off chance I would get to do some work over the next few days. The
tuxedo would have to wait. We left in the predawn hours on Saturday, heading
for our closest complimentary accommodations: a cousin’s home in Dallas.
From there, I monitored Katrina’s approach, staying up through the night
as it made landfall, listening to the same reports over and over again
as it moved inland. I slept a couple of hours once it appeared the worst
had passed.
I awoke to begin planning our return
only to learn of the collapse of the Industrial Canal and 17th Street levees.
Exhausting the selections made possible by cable television, I went to
the Internet for more information and then back to cable again. My
wife and I watched in horror as our city disappeared beneath Lake Pontchartrain’s
unrelenting waves. Knowing only that we could not return home, we spent
the rest of the week in Dallas and made ready for what seemed our inevitable
trek to Chicago. We had family and a place to stay there. That trip proved
uneventful, if sad, though it was punctuated by the unfailing kindness of
strangers who after learning we were in flight from New Orleans offered
food and refreshment, words of solace and encouragement, and even clothing.
Curiously, the only tension that I can recall resulting from our presence
came not from the inherent awkwardness of receiving such aid, but from
the testiness of soon-to-be relations who seemed gravely put out that we
did not stop at a tuxedo rental on the way to the wedding.
Once in Chicago we quickly fell into
a routine, albeit an unusual one. First came the constant monitoring of
events and conditions in New Orleans. The scenes in the Superdome and Convention
Center shook us badly, as did many reports in the national media. A large
fire on Carrollton about a mile from our home went uncontrolled far too
long and gave us some concern. We quickly found both local and extraterrestrial
sources (satellite pictures) on the Internet and mined those to supplement
the more conventional reports.
The passage of each day brought us
into closer contact with those who returned almost immediately and a few
who had never left at all. We were most appreciative of the bits of news
gleaned this way (particularly eyewitness accounts that eventually informed
us that our house sustained only minor damage and had not been flooded).
We remained apprehensive, however, about the safety of friends who remained
in harm’s way. Soon, my wife took charge of the process that would enable
us to return home, contacting (and contracting) electricians, plumbers,
roofers, painters, landscapers, appliance repair services, and inspectors
of various sorts. Daunting tasks when taken individually, collectively
these chores drew on reserves of patience and persistence that I could
not have previously imagined.
For my part, I fell in among old
friends and institutions in Chicago that provided a seemingly endless array
of activities and opportunities. Professionally, I had more than I could
handle. What struck me most was how easily I fell into a network left behind
nearly three decades before. I picked up with friends from undergraduate
days who made it seem as though I had been away for no more than an extended
summer break. Graduate school friends made me a T.A. once more, and later,
local professional contacts kept me engaged in my current work. It was
heady stuff and all, it seemed, too good to be true. Moreover, enough of
my family remained intact to provide the delicate balance of joy and angst
(tsuris for purists) that let loose another flood—this one of memories.
And then there was the city itself. Birthplace, but no longer home, its
allure, I understood, could be found in the very unreality of my situation.
By mid-November we anxiously, but eagerly, headed south to reclaim both
house and home.
Catherine Candy, assistant professor
of history, the department’s specialist in the British Empire and the history
of Ireland and modern India, has lived in New Orleans for two years.
What I will remember about the UNO
history department in the wake of Katrina is the startling alacrity with
which, with the university servers down and e-mail addresses gone, colleagues
created a Yahoo department listserv within three days and had it up and
running from a hotel room in Houston, although it took many weeks of detective
work to trace everybody. I will also remember the incredible kindness of
colleagues on my return to the city.
When it became clear that we would
not be going back to New Orleans for a while, I evacuated to family in
Ireland and was persuaded to talk about the hurricane in my old primary
(grade) school, where I was interrogated so tenaciously by the
children about the fate of the children of Katrina that I wondered if anyone
had studied transnational children’s solidarity. On taking refuge at my
alma mater, Irish historians at once remarked on the comparisons between
the discourse on the famine in Ireland and that of Katrina. As the UNO
faculty cobbled together courses for our widely scattered students in October,
I hastily pulled together an online course from Ireland that attempted
a comparison of the two catastrophes in terms of the roles of the state, class,
race, gender, empire, evacuation/emigration, death, and the “horror.” Then,
in early October, the news came of the tragedy in Pakistan and India, so
that tracing the common global history of the shaping of all three catastrophes
and aftermaths looms as the next teaching challenge.
Michael Mizell-Nelson, assistant
professor of history and specialist in U.S. labor and race relations, is
a long-time resident of New Orleans.
From the air as I approached New
Orleans, I saw the great number of blue tarps covering rooftops throughout
the city, so I wondered just how many of our things might have survived
on the second floor. We lived across the street from the 17th Street canal
levee breach, so I already knew that the first floor was a complete loss.
The muck that earlier covered our
neighborhood had dried into dust laden with heavy metals and kicked around
by the wind. Markings on the kitchen door revealed that our house had been
searched on September 25 (almost four weeks following the disaster). Another
mark near an upstairs window indicated that someone in a boat had checked
our house for survivors before the ten feet of water had drained.
We were lucky in that there were
no broken windows and no roof damage; also, no neighbors seeking higher
ground had used our place as refuge. Everything in the upstairs rooms remained
dry but smelled of sewerage and mold. The mold had sprouted along the walls,
but stopped one step beneath the second floor landing. The second floor
looked just as we had left it. Our daughter Keely’s Brownie vest lay on
the floor in the middle of her bedroom—exactly where she had left it. Except
for the muddy footprints of the search-and-rescue people, the top floor
had survived—and so had my research materials. Since we rented, I needed
only to salvage our possessions and leave our former neighbors behind.
We were most fortunate in having
a second floor; many of our neighbors lived in single-story homes, where
little could be saved. The sight of one neighbor’s collection of family
photographs—three decades of family history—strewn about the lawn in an
attempt to salvage some of the images broke our hearts.
Parents can’t help but view Katrina
through the eyes of children—their own and others. The kids’
artwork that filled our walls was destroyed. I can accept losing all of
the art, except for one piece our daughter made as a five-year-old. It’s
a self-portrait that included an Escher-like touch: her hands drawing her
portrait. I found no trace of that portrait. I have some regrets about
losing the hundreds of record albums and books, but I would do anything
to reclaim her picture.
Despite our children’s losses, they
are also relatively fortunate. At least one child in our daughter’s class
is experiencing a post-Katrina divorce. Several classmates face temporarily
disrupted homes as parents live apart in order to preserve jobs that moved
away from the city. Our family endured no such emotional or economic traumas.
If ever I begin to consider our family as unfortunate, I need only glance
at the three plastic tubs of our photos and videotapes, intact and safe,
and think of our neighbors and their family photographs, water-soaked and
mildewed, scattered like debris across their lawn.
Raphael Cassimere, Jr., Seraphia
D. Leyda Professor of History and specialist in African-American and U.S.
constitutional history, is a native New Orleanian and an alumnus of UNO.
Like so many, my wife and I evacuated
New Orleans. As we returned to the area along the I-10 through Jefferson
Parish, west of the city, the damage seemed moderate, but as we entered
the city the specter of death, decay,
and desolation was everywhere. The shock of seeing our Bywater neighborhood
was tempered by the welcome sight of a few neighbors, but we learned that
an elderly neighbor had died inside his home. Before going inside our house,
we said a brief prayer. Although the damage was worse than we hoped, we
still felt blessed, especially later as we drove to the university. En
route the scenery was indescribable. Destruction was everywhere. If Katrina
was an “act of God,” then as St. Peter observed long ago, “God is no respecter
of persons.” Katrina was an indiscriminate destroyer. My thoughts turned
to the awesome task of rebuilding.
James Russell Lowell wrote, “Once
to every man and nation comes the moment to decide.” I believe New Orleans
has been given a rare second chance to get it right. After the devastation
caused by the Civil War, New Orleans lost an opportunity to build an inclusive
society that treated all as equals before the law, without regard to race,
color, or ethnicity. Bigotry and bitterness aborted the dream.
Now, we can start over again. But where do we begin? Economic and social
reform, an updated building code, 21st-century levee protection, moral
reform? While these are important, the foundation on which all else is
built must be an excellent public school system, second to none, regionally
or nationally.
We should not clone the old system.
At best, it would replicate mediocrity. We must recruit the best and the
brightest, who must represent the racial and ethnic entirety of our community
and be held to the highest standards. Expectations should be very high
and monitored continually. Those we entrust to manage our public schools
must be persons of ability and integrity. We can no longer allow greedy
self-interest groups to select policy makers and administrators solely
on the basis of patronage. We must not tolerate cronyism or nepotism. Our
children must learn to live harmoniously within an ethnically, racially,
and socially diverse community.
We have a chance, perhaps our last,
to create a school system that can be a model for the state, indeed, the
nation. Instead of just catching up, we can lead the way. We must not retreat
to our divisive past. Rather, we must be bold as we venture into the uncertain
future.
Connie Zeanah Atkinson, assistant
professor of history, associate director of the Midlo Center for New Orleans
Studies and a specialist on New Orleans music and U.S. cultural history,
was a music journalist in New Orleans for many years.
Again and again, we are asked by
friends, family, and generous strangers, “What do you need?” What we don’t
need is stuff. And where, in our FEMA trailers, would we put stuff anyway?
On the other hand, we need so much. Our students need textbooks, and a
place to live. Our department needs research money, money to hire new people,
money for graduate assistantships. Our neighborhoods need everything.
What do I need? Well, I need to get
over the shame of needing. I have learned that it is more difficult to
receive than to give. I need to know what is going to happen with my university,
my neighborhood, my city. I need to know how to shape the experiences that
I have had and use them in teaching. I need to understand what role a historian
of the city can play as events are happening all around us. And I need
to figure out what to do about the anger—anger for the harm done our beautiful
city, anger when she (yes, New Orleans is always a “she”) is mistreated,
disrespected, and misused. The anger flashes when I hear discussions of
not rebuilding her; of the plans created by policy makers without
consideration of her special architecture, neighborhoods, and culture;
of the nation’s money being poured into corporate contracts that never
make it to the city’s people. My anger flared up again when the Southern
Historical Association moved their 2006 annual meeting away from New Orleans
to Birmingham, Alabama. This very public rebuff was especially irksome.
If New Orleans is of no interest to historians of the South after the worst
natural disaster in the nation’s history, when will she be of interest?
In their letter announcing the move, officers of the Southern sent us their
“prayers and best wishes”—and yes, we need those. Their conference we needed
even more.
Mary Niall Mitchell, assistant
professor of history and specialist in 19th-century southern and U.S. cultural
history, was the recipient of the 2004-2005 Oscar Handlin fellowship from
the American Council of Learned Societies.
I am not doing a Katrina project.
This is not to say that Katrina projects, carefully planned and
considered, are not worthwhile, particularly those that involve engineering
for our unreliable levees and psychological counseling for our devastated
population. It is to say that I am not at all certain of what the historian’s
role should be in the aftermath of such a disaster. While still sitting
at my mother’s desk in Florida in September (or was it October?) I received
e-mails from university officials encouraging me to develop a project,
write a grant, teach a class related to Katrina. One such missive even
suggested that a collection of narratives from Katrina survivors would
make a great book. The water had only just receded, I thought. My house,
untouched by flooding, did not have electricity. My answering machine would
not pick up. I was still looking for my neighborhood, the Irish Channel,
on satellite maps. Mice and coffin flies were having a house party in my
kitchen. Everyone I knew from New Orleans was in a state of shock. Nobody
knew what would become of the city, much less the university, which was
surrounded by floodwater. Nothing in my experience up to that point suggested
a “project” for me, a historian.
Since then, eloquent, meaningful,
and necessary work by journalists, songwriters, and artists has
appeared. It is all in the interest of healing, and if ever a population
needed a deep dose of catharsis, it is the people of New Orleans after
Katrina. Some of my colleagues at UNO and other schools are gamely instructing
students in the methods of oral history so that they can collect the experiences
of survivors. This, too, is important work. But I worry that in emphasizing
the importance of “Katrina projects” in the immediate fallout of this disaster,
we are missing the opportunity to deliver insight more profound than the
idea that catastrophes make compelling reading. Having lived through such
a disaster, and being daily disoriented, disheartened, and surprised by
it, we have gained a perspective on history that most people never fully
acquire. We are acutely aware that we do not know what is going to happen
next. We are in the middle of the history that will someday be written,
not because it will sell books but because, with perspective, reflection,
and research, we will learn from it. It is this inability to predict the
future, not the desire to seize on the recent past, that should shape
our thinking. As students of history, we should take from Katrina a renewed
appreciation for what those in the past endured, and for the difficult,
often wrongheaded choices they made despite and because of the uncertainty
they faced.