Joseph S. Lucas and Donald A. Yerxa, Editors

Popular and Professional History | Decline of Popular History? | Confessions of a Social Scientist | Asking Too Much of History? | Nineteenth-Century America | Democracy versus the Other | Jeffrey Burton Russell | Late Antiquity and Byzantium | Dispatch from the United Kingdom | Access to Presidential Papers | Letters

April 2002
Volume III, Number 4
Popular and Professional History
by John Lukacs
The
Path Between the Seas, the hitherto best history of the building of
the Panama Canal, written by David McCullough, was published in 1977. In
a widely adopted and best-selling American history textbook by two Harvard
professors, Freidel and Brinkley, America in the Twentieth Century
(1982), the bibliography lists McCullough’s book with these words: “A lucid
popular history of the building of the canal.”
“Popular
history?” What kind of nonsense this is. The Path Between the Seas
was not only well written; McCullough’s research, reading, and scholarship
were largely faultless. This volume represented—indeed, demonstrated—just
about all of the qualifications and desiderata of a historian, and then
some: of a professional as well as a popular historian by vocation, though
not an academic historian by affiliation.
Modern
historical consciousness evolved in Western Europe during the 16th and
17th centuries. “Historian,” as distinct from an annalist or a chronicler,
appears in the English language, according to the O.E.D., sometime between
1531 and 1645. Further development of our (until the 20th century, almost
exclusively Western) historical consciousness occurred in stages. During
the 18th century, history was regarded, and read, as literature. During
the 19th century, history was for the first time seen as a science. During
the 20th century, history was often considered as a principal social science.
(I believe that in the 21st century, history will become literature again:
but this will not be a reversal—rather, a contrary development: while academic
history may become less and less literary, all prose literature may become
more and more historical. But I am a historian, not a prophet.)
Back
to the 20th century, where we may detect (without much straining of our
eyes) a dual development. As an expectable and natural consequence of the
democratization of entire societies, the scope of historical reconstruction
has broadened. This widening involved many things: the study and the occasional
reconstruction of the history of majorities and not only of the ruling
minorities; the widening of the area of historical study and description
beyond the records of politics and of government; the adoption of methods
and of materials from the other “sciences” such as sociology, geography,
psychology, etc. The results have been mixed (for often they amounted to
not much more than the questionable data of retrospective sociologization),
but that is not my argument here. My argument is that we are still in the
presence of a dual development: for this by and large inevitable and even
commendable broadening of our historical perspective and study has occurred
together with a lamentable narrowing of some of the practices of professional
historianship, often to the extent that “academic” may be a more telling
adjective than “professional.”
Now
this widening of historical study and interest in our democratic age cannot
be but welcome. Let me cite one of my bêtes noires, from the
first edition of the Dictionary of the French Academy, published
in 1694, which defined history as “the narration of actions and of matters
worth remembering.” The eighth edition, in 1935, said much the same:
“of acts, of events, of matters worth remembering.” Dignes de mémoire!
Worth remembering! What nonsense this is! Is the historian the kind of
professional whose training qualifies him to tell ordinary people what
is worth remembering, to label or authenticate persons or events as if
they were fossil fish or pieces of rock? Is there such a thing as a person
and another such thing as a historical person? Every event is a historical
event; every source is (at least potentially) a historical source. History
has now entered the democratic age, which simply means (the meaning is
simple, though its reconstruction is complicated) that the historian must
at least consider all kinds of events, and all kinds of people. However—this
obvious, and necessary, widening of the scope of historical knowledge and
of its subjects has occurred together with a frequent narrowing of professional
historianship.
In
1777, the first Ph.D. in History was granted in Göttingen. A little
more than one hundred years later such degrees were granted in almost every
country in Europe and in America, with England following a little
later. That the professional qualification of historianship and the concept
of history as science rose together, marking the history of history during
the 19th century, is obvious. It had much to do with the German notion
of Wissenschaft, a term somewhat broader than the common English
concept of “science,” but of course there was much more to it. The merits
and the achievements, many of them long-lasting and still valid, of the
(German-inspired, though not exclusively German-discovered) Science of History
were enormous. Still, more than a century later we know—or ought to know—that
the strictly scientific concept and canons of history can no longer remain
sacrosanct, including such notions that professional history is restricted
to a study of documentary records; that such records are categorically
divisible into primary and secondary sources; and that proper historical
research and reconstruction will give us an absolute and incontrovertible,
unchangeable and unchanging truth about this or that portion of the past.
For
almost two hundred years now our knowledge of the past has been enriched
(and is still being enriched) by monographs produced by serious professional
historians of many kinds, even when these were (or still are) written
essentially by professional historians for other professional historians,
specialists in this or that portion of history. At the same time we ought
to remember that during the 19th century, the very century when history
had become professional, with many of its practitioners employed in universities,
the works of many noted professional historians were bought and read by
the educated public. This was so for Ranke, Treitschke, Sybel in Germany;
or for Michelet and Sorel in France; this was so not only for the great
nonacademic historians such as Macaulay in England, or for Tocqueville
or Taine in France; this was so in the United States, too, for such different
historians as Bancroft or McMaster or, somewhat later, Beard.
Even
then there were historians (Burckhardt or Mommsen, for example) who recognized
that history has no language and no method of its own; or that history
was art as much, if not more, than science. Then, during the 20th century,
the professionalization of history proceeded apace with the bureaucratization
of the profession. What happened often was not really a further separation
of professional from “amateur” historians but the devolution of many professional
historians into narrowly academic ones. By “narrowly” I do not mean specialists.
Good specialists are what we all badly need (and not only in history);
trendy and bureaucratic academics not at all. A good and honest specialist
is not someone who “knows more and more about less and less.” He is someone
who is seriously interested in his subject, even going beyond the frontiers
of his subject. The bureaucratic academic is the very opposite of that.
He is less interested in history than in historianship. His primary ambition
is his standing within his department and among his peers; he is ever ready
to adjust not only his ideas but the theme and even the subjects of his
work to what seems to be politic, respectable, or popular—within the academic
or intellectual world, that is.
The
harmful effects of the bureaucratization of professional historianship
need not be detailed or even summed up, except perhaps to say that they
involve not only questionable methods or even vocabularies but the very
personalities and the characters of their practitioners. That learned men
have their own petty vanities is nothing new; it should be known to all
literate people (read but Johnson’s “Rasselas”). But we also face, I think,
a relatively new phenomenon: the unsureness, rather than the self-satisfaction
of academics who wish to be properly reviewed and, on occasion, properly
published in a professional journal but who, at the same time, would give
their right arm to be printed or reviewed in The New York Review of
Books or even The New Republic—and not at all only for commercial
reasons. Something more than customary vanity is at work here. It includes
the uneasy, and, as yet, hardly conscious sense within the academy that
we are—surprisingly—living at a time of an unprecedented and, yes, popular
interest in history.
Toward
the end of the 20th century, indeed, of the entire Modern Age there developed
among the peoples of the world a phenomenon which was unexpected and unprecedented:
a gross appetite for history. This appeared at the same time when traditional
beliefs and habits and customs and traditional practices of education and
of artistic representation were vanishing. The evidences of this prevalent
and spreading appetite for history have been so multitudinous and protean
that merely to list them, or to sum them up, would take many pages. They
include matters such as the burgeoning of popular historical magazines,
of all kinds of historical societies, of all kinds of historical programs
on television and in the movies. What is even more telling: books about
history and biographies now sell much more and much better than do novels—a
reversal after the 250 years since the novel first appeared as a form of literature.
Perhaps the most remarkable evidence of this tendency may be found in the
United States, whose popular ethos, the Novus Ordo Seclorum, had
a nonhistorical or perhaps even anti-historical tinge. I cannot, at this
point, and within this article, even speculate why this has been happening
(except perhaps to say that it has had nothing to do with the so-called
“conservative” movement, since this appetite for history amounts to something
deeper than a reaction against Liberalism or Marxism). Let me only say
that history is never of one piece: that currents on the surface, however
oceanic-looking and powerful, evolve at the same time with other, deeper
currents that are flowing in different directions. And we must also consider
that this, I repeat, is not a return to what happened 250 years ago when
people began to be interested in history as if that were a new and interesting
form of literature. Those readers still belonged to the small minority
of the upper and educated classes, whereas now an appetite for history
exists among many kinds of people in what are now increasingly classless
societies.
One
of the marks of this surprising, and encouraging phenomenon is that we
are now in the presence of something like a golden age of biography. Seventy
years ago Harold Nicolson, a superb biographer, speculated that biography
may have come to an end because of the scientific, rather than artistic,
exploration of the human mind. The opposite has happened. It is not psychoanalysis
but history that is now married to biography. One hundred years ago there
were not many who considered biography as a form of history; it was but
a form of literature, and there were not many serious historians who wrote
biographies. This is now past. What is more important is the appetite of
the reading public for all kinds of biographies—and, equally important,
the condition that every serious biographer in our time feels compelled
to do his research and to compose his text in entire accord with the professional
practices of historiography.
Much
of this unprecedented and impressive growth of a popular appetite for history
has been ignored by the majority of academic historians. It has developed
at the same time when the disappearance of history courses, not to speak
of requirements, in high schools and even colleges was going on, and when
the gap between academic and popular history was widening. In The American
Scholar (Winter 1999), in a symposium on the National History Standards
under the title “Teaching American History,” I wrote:
Yes,
historical appetite among Americans is unprecedented and large. Of course
it is served, and will continue to be served, by plenty of junk food. Of
that professional historians may be aware. Of the existence of the appetite
for history they are not.
That
is the problem—the isolation of professional intellectuals, a largely self-made
isolation that is perhaps more extreme than it was in the past. When the
problem is nutrition, the recognition of the appetite comes first. Then
comes the recognition and the criticism of junk food, which is hardly corrigible
by the printing of a national cookbook—or even less by a pamphlet on National
Nutritional Standards with which, I am afraid, the present debate, with
all of its merits, is concerned.
Rereading
this I think that I should have written academic, rather than professional
historians, for at least two reasons. There are professional historians
who do recognize the problem; and who are capable of writing history with
high professional standards and qualities at the same time when their narrative
prose is, at least potentially, popular. The other reason is the existence
of the works of many non-academic historians and biographers whose lack
of an academic affiliation is seldom sufficient to categorize them as being
non-professional, or “popular.” If there is a division (and that is not
always clearly ascertainable), it is not between professional and popular
historians but between creative historians and academic historians, or—perhaps—between
good historians and bad ones. That a good poet is someone who must have
a Ph.D. in Poetry is of course absurd (though, here and there, we are getting
to that too). That one must have a Ph.D. to be a good historian is less
absurd: but, still . . . .
Yes,
appetite is a good thing, a gift from God. But the value of things depends
on their husbanding; and given human frailty, the recognition of an appetite
immediately calls forth all kinds of people ready to profit from it. Yes,
we are in the presence of this great and unexpected appetite for history.
But we also have a host of its gross providers. And what appears, too,
is the temptation of able historians to adjust (and frequently lower) their
standards of work when the living comes easy. There is the recent example
of Stephen Ambrose. There is the example of Barbara Tuchman, whose The
Zimmermann Telegram was a first-rate and unexceptionable study in military
and international history: but then, whatever their popular successes,
her subsequent works did not quite represent such standards, culminating
in her last celebrated volume, The March of Folly, which showed insufficient
reading and a host of questionable assertions. There are the examples of
some of the books of historians among us who during the last thirty years
or so have successfully translated themselves from England to these shores.
(Kennedy, Keegan, Schama come to mind.) I began this article by citing
David McCullough’s The Path Between the Seas as a sterling example
of “popular history” that was (and remains) much more than merely “popular.”
Yet his Truman and Adams biographies, successful as they are, and with
their research following a number of professional standards, do not quite
compare in their qualities with his earlier books. (Robert Ferrell’s Truman
biography, in my opinion, is richer and better than McCullough’s—and it
is half the latter’s size.)
But
then this is a human predicament. With the still rising flood of all kinds
of historical writing and pictorial stuff, we ought to be aware of how
junk food can feed and even satisfy appetites. I am not, for a moment,
arguing for the abolition of the Ph.D., or of professional or even “merely”
academic historianship. I often think that the main task of academic historians,
now and in the future, is to recognize and describe and point out twistings
and falsifications and other mistakes and shortcomings in this or that
kind of historical representation. But that kind of effort must be more
than academic snobbery, more than a result of conscious (or unconscious)
envy, more than a suggestive assertion that historical truth is a matter
for professional historians, their restricted province. What we must recognize
is that the purpose of history is the reduction of untruth. And is it not
at least possible that the present appetite of all kinds of people for
history may be largely due to their, perhaps hardly conscious, dissatisfaction
with the large and heavy clouds of untruths hanging over the world, affecting
our very lives in this age of mass democracy, of a—so-called—“Information
Age”?
John Lukacs’s latest book is At the End of an Age (Yale University Press, 2002).
Join the Historical Society and subscribe to Historically Speaking
The Decline of Popular History?
by John Wilson
Sean
Wilentz’s essay-review, “America Made Easy,”[1]
calls to mind a favorite plot device in the sitcoms of the 1950s. The husband
wants to do something—go fishing with his next-door neighbor, maybe, nothing
nefarious, nothing illicit, but for one reason or another he can’t simply
tell his wife that he is going fishing. So he invents a pretext for the
trip: he’s going to visit his cousin, Elmer. Cousin Elmer? The wife doesn’t
remember any such person—and for good reason: he doesn’t exist. And anyway,
what’s the urgency? I thought you were going to clean the gutters this
weekend. Clearly the story needs to be embellished further. Cousin Elmer,
it turns out, is on his deathbed . . . . And so on and so on, as an ever-more
fantastic edifice is constructed, until it all comes crashing down.
Sean
Wilentz wants to go fishing. He is reviewing David McCullough’s biography
of John Adams, but his real subject is larger. Wilentz wants history that
will “rattle its readers, not . . . confirm them in their received myths
and platitudes about America.” He wants historians with a “taste for ambiguity.”
Fair enough. On these counts, McCullough is tried and found wanting. His
“vivid and smooth prose” goes down all too easily, and his “obsession”
with the personal character of his subjects (Truman, Teddy Roosevelt, Adams)
diverts attention from what really matters.
There’s
plenty of material here to fuel a usefully provocative essay. Is McCullough
as breezily complacent as Wilentz suggests? Is character more important
than he allows? How should America’s story be told, and how is
it being told? Point us in the direction of books that offer something
more than “America made easy,” history that comes up against the resistance
of the real, history that gives us a richer and truer understanding of
the making of Americans.
It’s
a marvelous subject, but having raised it Wilentz proceeds to evade and
obscure it. He constructs a grand narrative—the decline of popular history,
with McCullough as admonitory example—that in turn compels him to construct
assorted side-narratives, such as “a renewed rage for historical fiction,”
which “has produced what amounts to a fresh round of costume-drama Americana.”
And like the tale spun by the hapless sitcom husband, Wilentz’s ramshackle
argument grows ever more flimsy the more he adds to it. McCullough, the
Civil War according to Ken Burns, “Simon Schama’s erudite and jolly and
empty traversals through everything from a 19th-century murder mystery
in Boston to the history of Britain since 3500 B.C.”—all symptoms of the
same decline, “popular history as passive nostalgic spectacle.”
But
wait a minute. What or where did we decline from? And how did it happen?
Wilentz answers the second question first. The kitsch we’re wallowing in
can largely be explained as a “reaction against the new departure, the
new stringency” with which Richard Hofstadter and his generation of historians
reassessed American history. Giants walked the earth in those days: “From
the 1950s through the 1980s, American historians devoted themselves to
a remorseless re-examination of the nation’s past.” It was, in C. Vann
Woodward’s phrase, “the age of interpretation.” Alas, “over the last decade
or so,” what was once bold revisionism became the new orthodoxy, small-minded
men and women began churning out tedious and unreadable monographs, the
reading public grew disgusted, and—presto!—the Great American History Circus
was in business, with David “Buffalo Bill” McCullough as ringmaster.
If
you think I am caricaturing Wilentz’s argument, go back and read his article.
But even taken on its own terms, his account is incoherent. During the
very period in which his heroes were remorselessly going about their work,
pop accounts of American history, rich with sentimentality and nostalgia,
flourished on every hand, not in reaction to Hofstadter & Co.
but on a parallel track: in popular history (including the Landmark books
I read as a boy), in reenactments of historic battles, and in the massive
novels of James Michener, the multi-volume sagas of the indefatigable John
Jakes, and a host of other fictions. What was Roots, the TV epic
of slavery, but the supreme instance of all that Wilentz deplores in popular
history?
Contra
Wilentz, there has been no “revival” of popular histories of America; they
never went away. But perhaps there has been a decline in the quality
of the telling? Here we come to Wilentz’s answer to the first question:
What or where did we decline from? The answer is actually whom:
from Bernard DeVoto. In Wilentz’s account, DeVoto as a historian was all
that McCullough fails to be. DeVoto was serious and spiky, not merely vivid
and smooth, and he “recognized . . . the slightly mad strain in all of
American history.”
All
right. Let us suppose, just for the purposes of argument, that Wilentz’s
high regard for DeVoto and his criticism of McCullough are equally well
founded. What would that prove about the “decline in popular history”?
Absolutely nothing. One might just as well prove the superiority of our
popular history by comparing a jingoistic tract from the 1930s with Evan
Connell’s Custer book, Son of the Morning Star—a bestseller by a
non-academic, a superb writer who certainly possesses in full measure that
“indispensable sense of American strangeness” which, Wilentz laments, “seems
to have disappeared among our leading popular historians.”
The
arbitrariness of the comparison between DeVoto and McCullough infects Wilentz’s
entire essay, and the entire narrative of decline is nothing but a distraction.
Yes, there is plenty of dreck in the American history section of
bookstores these days, some of it leaning toward the “academic,” some of
it aspiring to the “popular.” There is also more good stuff than anyone
has time to sample, let alone thoroughly digest.
You
want American history? How much time do you have? Here’s a small stack
of books, none of them calculated to reinforce “received myths and platitudes
about America,” heaven forbid: Arthur Quinn’s Hell with the Fire Out:
The History of the Modoc War, Bill McClay’s The Masterless: Self
and Society in Modern America, Ed Larson’s Pulitzer Prize-winning history
of the Scopes Trial, Summer for the Gods, good to read alongside
David Hollinger’s Science, Jews, and Secular Culture, Charles Alexander’s
biography of that early baseball titan, John McGraw, and Jean Bethke Elshtain’s
biography of McGraw’s contemporary, Jane Addams and the Dream of American
Democracy. And why not add to the pot The Kingdom of Matthias: A
Story of Sex and Salvation in Nineteenth-Century America, coauthored
by Wilentz and Paul E. Johnson, a superb narrative. Does the list seem
random? That’s just the point. Anyone who presumes to generalize about
the current state of American history-writing is likely to be constructing
the case with all the finesse of a Chicago homicide squad. The evidence
doesn’t fit? Hey, we’ll make it fit.
One
of my best friends ran a used-book store for years until the Internet drove
him out of business. He has a B.A. in history from the University of Illinois,
and he has read dozens upon dozens—perhaps hundreds—of books about the
Civil War, many of them intensely narrow in their focus. He loved Ken Burns’s
PBS series, not because he has a naive or nostalgic conception of that
terrible war but because Burns and his team told the big story in a way
that the mind could hold, with images and music that complemented the millions
of words he’s read. What a viewer or a reader gets from the work of Ken
Burns or David McCullough depends in part on what he brings to it. Reality
is messy, and Sean Wilentz is right to call for history that acknowledges
the resistant particularity of its subject. Too bad he didn’t follow his
own advice.
John Wilson is the editor of Books & Culture.
[1] “America Made Easy: McCullough, Adams, and the Fall of Popular History,” The New Republic (July 2, 2001).
Tales from the Dark Side: Confessions of a Social Scientist
by Peggy G. Hargis
I first
learned that I had been awarded a National Endowment for the Humanities
Fellowship when a dear friend sent me a congratulatory email. I was astounded—in
part because I had not yet received official notification, in part because
I was not trained in history or the humanities, and in part because I am
a Southerner, and we never expect to win. You see, I am a sociologist by
training, a social scientist by choice, but I have become a historian by
the grace of my humanist friends.
Although
I never considered myself a vulgar empiricist, I was weaned on statistical
methods, and my initial research on African-American land ownership at
the turn of the 20th century was highly quantitative. It described the
trends and geographic patterns of African-American land ownership and identified
the characteristics of counties where blacks were most (or least) successful
in acquiring acreage. I found that blacks encountered fewer barriers to
ownership in some types of commodity cultures than in others, and that
the temporal shift from land gain to land loss was not as uniform or as
straightforward as historians had implied. I suspected that African-Americans
devised strategies for buying and keeping their land based on their local
surroundings, but I could not squeeze information about potential landowners,
their families, or their local communities from county-level statistics.
If I was to understand how community-based social, political, and economic
relations affected African-Americans’ opportunities for land ownership,
I would have to look at other kinds of information.
I wish
I could say that I knew then exactly where to go to find my answers, that
I understood precisely what the humanities had to offer, and that everyone—sociologists
and historians alike—enthusiastically encouraged my interdisciplinary interests,
but alas, that would be a gross misrepresentation. I quickly learned that
some historians suffer from what I call the definitive-work syndrome.
“Have you not read so and so’s book? The topic has been done.” In other
words, the trend of black land gain to land loss had already been adequately
accounted for. No need for a sociologist to poke her nose into it. Some
of my fellow sociologists were equally unimpressed and perplexed by my
interest in history. I was once introduced by a former professor to a potential
new-faculty hire. She quipped, “Peggy’s dissertation was historical, but
I’m sure she is beyond that, now.” She made history sound like a bad head
cold—something to be rid of as quickly as possible.
Though
I came to the humanities self-consciously, my progress was uneven. When
I began I had only the vaguest idea what a manuscript collection was, I
had never set foot in an archive, and the acronym NUCMC[1] sounded
like something that would stain my clothes. I relied on the kindness of
my historian friends to guide me as I learned a new vocabulary, discovered
different types of sources, and practiced unfamiliar methods. I no longer
analyzed data; I interpreted sources. I no longer tested hypotheses; I
assessed competing arguments. Words replaced tabular summaries of regression
coefficients, and I struggled to write clear narrative prose, a style alien
to most sociologists. And I asked lots of pesky questions.
For
a long while I felt schizophrenic. I was convinced that at some point,
I would look up from reading a dusty manuscript to discover a stern-looking
archivist pointing her finger and whispering in a raspy tone, “Impostor!
Cast her out!” Yet, now that I am comfortable in my new skin, it is becoming
increasingly difficult for me to recognize the old battle lines that once
seemed so formidable. The rigid boundary between the social sciences and
humanities has become an elusive threshold that shifts in the sand. How
did this heresy happen? What common ground could the social sciences and
the humanities possibly share?
An
important venue for my crossing intellectual and disciplinary boundaries
came when a group of us organized a research network that included women
from anthropology, history, and sociology. We met every other week for
two hours to discuss our research and the challenges that we, as women,
felt we encountered in producing scholarship. Although we expected to support
and nurture each other in our research endeavors, I do not think any of
us could have predicted how much substantive help we would give one another.
Our research interests, which span intellectual, urban, and women’s history,
European cultural anthropology, 19th-century Southern archeology, and race
inequality, seemed too disparate for us to expect tangible payoffs. But
we were wrong.
Now
into our second year, we continue to learn a great deal from each other
about substance, as well as process. Since we are not necessarily familiar
with the conventions, methods, or literature outside our own fields, we
cannot assume that others in the group will blindly accept the “orthodox”
explanation, whatever that might be. Talking across disciplinary and intellectual
borders forces us to make our assumptions more explicit, our arguments
clearer, and our standards of evidence more readily apparent. Our different
approaches and backgrounds also mean that we can recommend articles, books,
and potential funding sources that might otherwise be overlooked. In short,
we learn from our differences.
My
journey across the humanities/social sciences divide did not begin and
end with research, however. I had the privilege of coteaching two interdisciplinary
courses at Georgia Southern University—an uncommon opportunity at my school.
I first taught with a cultural anthropologist and later, with a cultural
historian. Team-teaching across the humanities/social sciences divide flavored
our classes in ways that made them distinctly different from our discipline-specific
courses.
Because
there were two of us in the classroom, we did a better job of fielding
students’ questions. Each of us raised points that the other had not immediately
thought of, and this made it easier to shift from one line of reasoning
to another if a student was having difficulty understanding a concept or
topic. We drew from our own disciplines to illustrate ideas or events,
though our different intellectual backgrounds also meant that, sometimes,
our views of key issues were at odds. We listened to and challenged each
other, and our public disagreements gave students the opportunity to emulate
productive intellectual discourse. Not only did I learn new information
from my colleagues, but the familiar took on new meaning when I heard it
presented from a different vantage point. In my experience two heads and
two intellectual perspectives proved to be better than one.[2]
But humanists and social scientists need not teach together in the same
classroom to share common goals.
Courses
in the humanities and social sciences are designed to teach students to
think. We all recognize that in this age of sound bites, instant messaging,
live news reports, and Internet surfing, it is more important than ever
for students to be critical consumers of information. To make sound and
reasoned distinctions between truth and falsehood, judgments about right
and wrong, or evaluations of almost any sort, students must possess a host
of intellectual skills and abilities that in combination are commonly called
critical thinking. To think critically demands that students give fair,
reasoned consideration to evidence, context, and methods, but it also necessitates
that they be open-minded enough to subject their personal opinions and
beliefs to the same sort of penetrating examination. This takes time and
lots (and lots) of practice. There are no shortcuts, quick guides, or easy
avenues to becoming critical thinkers. The process is incompatible with
instant gratification.
If
students are simply looking for the credentials that will land them a good
job and an easy lifestyle, then they will likely complain that studying
the humanities or social sciences is a time-consuming, and perhaps a time-wasting,
enterprise. It has, as they will be quick to point out, a poor dollar return
for the time spent. “I’ll never use this information in my job.”
“What, we have to read the whole book?” “Why did I get a C on this
paper when I spent an entire afternoon writing it?” “My teacher
never just tells me the answer.”
The
humanities and social sciences are a speed trap on the information highway.
Yet they are also perfect venues for teaching and learning the very skills
that students will need the most, whether they recognize it now or not.
Classes in the humanities and social sciences teach students to think critically,
argue logically, and write clearly. All of us want our students to recognize
the underlying assumptions of arguments, to understand how patterns of
thought and knowledge are directly influenced by political, economic, cultural,
and social structures, and to articulate opposing viewpoints or propose
alternative solutions to problems. We may share a common ground inside
the classroom, but what about the practice of research?
I am
gutsy enough (though some might argue misdirected) to believe that social
scientists can learn much from their humanist brethren, and vice versa.
I know I have. I would even go so far as to argue that I am a better researcher
because of my interdisciplinary experiences outside the mainstream of sociology.
But I am not entirely repentant. My research perspectives and habits are
still unapologetically social scientific. I still look for the overarching
patterns, compare competing explanations, use numbers, and make my assumptions
and, thus, my arguments, explicit. But I am also well aware that methodological
techniques cannot purge the research practice of unrecognized assumptions
or biases, that social structures are rooted in human actions and behaviors,
and that the details of human experience matter. Postmodernists, who put
quotation marks around words like truth and facts, would think me old-fashioned
because I still believe in evidence and causality. But then, so do many
humanists.
I came
to history on a path less traveled—by way of sociology, statistics, and
theoretical models. But good research, whether in social sciences or the
humanities, has more in common than we sometimes care to recognize. Crossing
disciplinary and intellectual borders strengthens our methodological arsenal
and broadens our understanding of the social world. Whether we begin with
small clues or an explicit theoretical model; whether our sources come
as columns and rows of numbers or dusty manuscripts; whether our interests
are international, national, or local in scope; and whether our findings
are summarized in tables, graphs, or words, should depend on what we want
to know, not on what disciplinary conventions dictate. Perhaps it’s time
for both humanists and social scientists to stop fighting the barbarians
at the gate, and welcome the other in.
Peggy
G. Hargis is an associate professor of sociology at Georgia Southern University
in Statesboro, Georgia. She is completing a book on African-American land
ownership, entitled After the Whip: The Rise and Fall of the Black Yeomanry.
_______________
[1]National
Union Catalog of Manuscript Collections.
[2]For
more about team-teaching and critical thinking, see Georgina Hickey and
Peggy Hargis, “Teaching Eighties Babies Sixties’ Sensibilities,” Radical
History Review 84 (Fall 2002), forthcoming.
Join
the Historical Society and subscribe to Historically Speaking
Are
We Asking Too Much of History?
by
Allan Megill
In
a much-quoted statement in the preface of his Histories of the Latin
and Germanic Nations (1824), the young Leopold Ranke remarked that
"history has been assigned the office of judging the past, of instructing
the present for the benefit of future ages." Ranke demurred: his
work, he tells his readers, "wants only to say what actually happened."
Ranke was reacting against earlier views that gave history the task of
being a preceptor of life (historia magistra vitae, as the tag had
it), offering general rules by which we might guide our actions. He was
surely justified in rejecting the morally- and pragmatically-oriented approaches
to history that he found in earlier writers. First, such history could
only produce distorted representations of the past. Second, the claim to
offer lessons in ethics and prudence was fraudulent, for such-and-such
actions in the past were judged to be exemplary, but on the basis of pre-existing
ethical views. The resulting history was thus an exercise in false confirmation,
giving back to the present the present’s own prejudices dressed up in the
garb of antiquity. Third, this history misrepresented the present
as much as it did the past. Although his political stance was radically
opposed to Ranke’s, Karl Marx made much the same point at the beginning
of The Eighteenth Brumaire of Louis Bonaparte, where he notes how
images taken from the past obscure people’s apprehension of the world as
it actually is. Had it been relevant to his argument, Marx might also have
pointed out that the contrary is also true, for the search for lessons
from the past obscures our apprehension of the world as it actually was.
What
might a Ranke redivivus say about the tasks that are assigned to
history today? In Ranke’s day only a few, the highest elite, could do the
assigning. Today multitudes crowd the scene, both rulers and demos,
both producers and consumers of history, clad in varied clothing and clamoring
loudly. They include state legislators, intent on the right teaching of
history in the schools and colleges of their fine states; federal legislators,
horrified at the historical ignorance of college students and of the general
public; generous donors, intent on establishing a Chair for the history
of this and a Chair for the history of that; Americans proud of their ethnic
heritage or religion, who want to make sure that its glories are properly
represented and celebrated; veterans of wars, eager to see that the wars
in question are both correctly interpreted and piously commemorated; and
those persons bereaved by, or perhaps only touched by, all major disasters,
most recently the shocking events of September 11, 2001. Beyond these persons
or groups, who have an entirely explicit and active concern with history,
are others whose concern is more diffuse and consumption-oriented. I think
here of those who “appreciate” history in the way that one might appreciate
an appealing wallpaper pattern or a nicely manicured front lawn—who visit
presidential houses, stop at battlefields and at historical sites of other
kinds, perhaps pause in their travels to read historical markers, and in
general admire oldish things. Their appreciation sometimes goes so far
as to express itself in the assertion, “I have always loved history."
In
short, Clio has many friends and perhaps some lovers. But love and friendship
are not without their price. The fans clamoring round ask for more things
than the judging and instructing that worried Ranke. The tasks that are
today assigned to history seem to be of four types. First and foremost,
there is assigned to history the task of identifying—of creating and sustaining
identities of various kinds, and hence of making "us" (whoever "us" is)
feel good about ourselves. Crucial to this task is the related enterprise
of commemorating the actions and especially the sufferings of those individuals
and (especially) groups who are thus identified. Second, history is assigned
the task of evangelizing—of strengthening our civic religion. Third, history
is given the task of entertaining us. Finally, to history is assigned the
task, where possible, of being useful. To be sure, it is acknowledged that
history cannot be quite as useful as engineering, business administration,
and animal husbandry, and it is widely held as well that some history,
usually that which is distant from us in time, space, or culture, cannot
be useful at all. Hence this fourth task pales before the other three.
Am
I mistaken in thinking that these tasks—especially the tasks of identifying,
evangelizing, and entertaining—are widely assigned to history today? I
think not. Consider the distribution network for history books, a network
that, thanks to the Internet, is more visible now than in the past. Definite
claims and assumptions concerning history recur in the advertising that
emanates from such Web sites as www.amazon.com and www.barnesandnoble.com.
I have
perused advertisements appearing in several e-mail newsletters intended
by Amazon for history buffs (I have also looked in a more casual way at
the blurbs for best-selling history books that come up when one browses
the subject "history" on Amazon). I here reproduce the text of three advertisements,
drawn seriatim from Amazon's history e-mail newsletter of May 11, 2001:
Pearl
Harbor: The Day of Infamy—An Illustrated History by Dan van der Vat. President
Franklin D. Roosevelt famously declared December 7, 1941 as "A day that
will live in infamy," the day the Japanese attacked Pearl Harbor, pulling
the U.S. into World War II. This visually stunning book, by noted historian
Dan van der Vat, features groundbreaking research, over 250 images, including
previously unpublished personal photos from the perspective of both Americans
on the ground and Japanese in the air, as well as a moment-by-moment breakdown
of the attack. There are also numerous personal accounts, memorabilia,
and illustrations by Tom Freeman. A major achievement.
An
Album of Memories: Personal Histories from the Greatest Generation by Tom
Brokaw. As he's done in his immensely popular Greatest Generation volumes,
Tom Brokaw again celebrates the trials and triumphs of the Americans who
experienced the Depression and World War II. Album of Memories is a collection
of letters written to Brokaw by those who lived during this period, and
in some cases, their children. Complete with photographs and memorabilia,
the overall emotional impact of these letters is intense. To read them
is both a moving experience and an opportunity to experience history at
its most intimate.
Disaster!
The Great San Francisco Earthquake and Fire of 1906 by Dan Kurzman. Just
after 5 a.m. on April 18, 1906, an earthquake measuring 8.3 on the Richter
scale ripped through sleeping San Francisco, toppling buildings, exploding
gas mains, and trapping thousands of citizens beneath tons of stone, broken
wood, and twisted metal. Drawing on meticulous research and eyewitness
accounts, Dan Kurzman re-creates one of the most horrific events of the
20th century. More riveting than fiction but incredibly true, Disaster!
is unforgettable history—a masterful account of the calamitous demise and
astonishing resurrection of an American city.
I do
not aim to score cheap points against works that no one expects to live
up to the standards of scientific history. I refer to these works only
because they reveal an orientation toward history widespread in present-day
American culture and perhaps also in modern culture generally. I also suspect
that this orientation has made inroads into the discipline, although in
a casual essay I cannot develop my points at length or justify them adequately.
The
idea that history might be ethically or pragmatically useful is absent
from the above examples. Nothing so distancing as an ethically normative
idea makes its appearance. The advertising for An Album of Memories suggests
that Tom Brokaw deals with "the trials and triumphs" of the "greatest generation"
in order to celebrate those endeavors and to give his audience a "moving
experience." Implied is that history can give us an emotional empathy with
past human beings, but what seems important here is the frisson of an immediate
relation. The claim that history might have a pragmatic utility is also
absent from these examples. But very few works of history dare to make
this sort of claim: they constitute a small and specialized genre, one
that usually emanates from people connected more to a policy-oriented political
science than to history. On the other hand, history's other contemporary
functions are all proudly on display, so closely intertwined in the minds
of the publicity people as to be almost indistinguishable. Most obviously,
an appeal is made to American identity, particularly in Pearl Harbor and
An Album of Memories. All the books evangelize, propagating what can be
best described as a can-do faith, one that affirms our capacity to overcome
the difficulties that life brings, even earthquakes. Along with the identifying
and the evangelizing comes the promise of great entertainment—we are being
offered experiences that are "visually stunning," "intense," and "riveting."
No
doubt the advertisements' highlighting of entertainment value is to be
expected, given that the books are all aimed at a mass audience. Obviously,
one cannot directly turn from popular history of this sort to the writing,
teaching, and other related activities of professional historians. There
are nonetheless convergences between the two spheres that make a reflection
on these popular modes relevant to the professional enterprise of historical
research and writing at the beginning of the 21st century. The most obvious
locus of convergence is to be found in the sphere of computer applications.
Such applications claim to offer something that is also very evident in
the advertising copy. They offer immediacy- or at least a simulacrum of
immediacy. Some of the more recent historical documentaries to be seen
on television make a similar promise (they differ from an older generation
of documentaries, such as Thames Television's 1973-74 series on World War
II, The World at War, where the presentation is much more depersonalized
and "objective"). A few mornings ago my dental hygienist, noting that she
"loved history," mentioned the latest production by Ken Burns. New technology,
particularly Microsoft's PowerPoint®, makes it possible for the average
historian to be Ken Burns-like in classroom lectures. Soon, job candidates
who neglect to prepare lectures using PowerPoint will be putting themselves
out of the running for jobs at colleges where the central emphasis is on
teaching. A lecture on 16th-century Augsburg, for example, will be so much
more appealing to a lower-level undergraduate audience when the lecturer
can scan in lots of visuals and project them as needed.
If
I have to attend a lecture for lower-level undergraduates on a subject
not expected to appeal to them, I would rather it include visuals than
not.[1]
My concern is not to reject the technology but to draw attention to certain
dangers inherent in it. Consider again the identity-oriented attitude toward
history (with its heavy concern for commemoration and memorialization),
the assumption that a central task of history is to affirm and strengthen
our ideals, and the insistence that history should be entertaining. Consider,
further, the heavy emphasis that is given—at least in many colleges and
universities—to the histories of those entities that are thought to be
close to us (our own country, even our own region). Consider also the impact
on thinking of certain aspects of the new technologies. (For example, I
was recently struck by a presentation given by a "digital historian," who
took it as a given, without adducing any argument, that such-and-such a
debatable historical claim was true, and who then focused almost entirely
on how this truth could be most strikingly presented, in a Web-based form,
to the public.) And consider the ideological and interest- or identity-group
pressures that are exerted on history. To be sure, none of these orientations
and tendencies should be rejected entirely. History is, after all, an impure
science. Yet at the same time they need to be resisted.
The
core of the difficulty lies in the claim to historical immediacy. The claim
can be found almost everywhere. The advertising cited above uniformly suggests
that consumers of history will be brought into direct contact with the
action. Note how the advertisements focus on personal experience. We are
told that "one of the most horrific events of the 20th century"—the San
Francisco earthquake of 1906—will be "re-created" by the author. We are
told that reading letters written to a news anchor by "Americans who experienced
the Depression and World War II" is "both a moving experience and an opportunity
to experience history at its most intimate." The photos in Pearl Harbor,
taken "from the perspective of both Americans on the ground and Japanese
in the air," likewise seem to promise a virtual reliving of the attack.
The book Pearl Harbor was issued at about the time that a "major motion
picture" focused on Pearl Harbor came out. The "tie-in" was intended. Pearl
Harbor the movie was not a historical documentary but a lame love story.
Still, it attempted to convey, in its sequence showing the attack on Pearl
Harbor, the impression of "you are there." Consider another entertainment,
the 1997 movie Titanic—likewise not a historical documentary, but a brilliantly
realized tragic romance. The producers of Titanic made a self-conscious
appeal to historical authenticity, for they went to enormous lengths to
duplicate the look of the original ship and its furnishings. Here historical
immediacy turns into an aesthesis of history, an attempt to get viewers
as close as possible to the sight and sound of historical reality itself.
The producers were on to something-they rightly intuited that, for a vast
audience, history, if it does not mean "dead and gone, irrelevant," means
the immediate representation of objects from the past.[2]
But
the promise of immediacy is not to be found only in popular history books,
or in the history put together by video documentarians, or in the historical
aspect of popular entertainments. It is also to be found in the work of
professional historians and of others connected with them (most often,
museum professionals). This is hardly surprising. After all, identifying
history is overwhelmingly concerned with what it takes to be "our" identity;
evangelizing history with what it takes to be "our" faith. Hence the attempts
that curators make at Colonial Williamsburg, Plimoth Plantation, and other
similar sites to reproduce the "look" and "feel" of life in an earlier
time. One sees the same impulse in some digital "archives," those driven
by a concern for making "everything" that exists concerning a given past
time and place available in a single set of Web pages, with the aim of
enabling the browsing public to enter into the life of a past community
or the flux of a set of past events. Let us leave aside the problems that
exist on the evidentiary level—above all, the quaint assumption that all
evidence relevant to explaining X in some given time and place is localized
within that time and place (whereas the universe of possibly relevant evidence
is in fact unbounded, its actual boundaries only determinable through continuing
argument). The deeper problem is the making of a promise that can never
be fulfilled. Consider also the growing number of projects that combine,
virtually in a single entity, features of the archive, museum, and memorial.
No doubt there is a kind of catharsis and reflectiveness that can be achieved
by standing in the place of (even taking the number of) a Holocaust victim,
and there is perhaps also the confirming of an identity. But again the
assumption of immediate identification is left epistemologically unjustified,
while, on the interpretive level, the stakes of the identification are
kept implicit, hence unargued, hence uncontroverted, hence undefended in
any full sense of the term. Consider, finally, a certain kind of historical
biography, the kind that aims at recreating what we might call "the inner
Mr. X" (for "Mr. X" substitute any historical figure). Never mind whether
a sense of Innerlichkeit is applicable to Mr. X in quite the way that it
is applicable to "us," "now."[3]
The
question with which I began was: Are we asking too much of history? At
first glance it might seem that we are. In wishing to "say what actually
happened," Ranke was not at all concerned with unveiling events as they
were immediately felt by historical actors and sufferers. Rather, as he makes
perfectly clear in his 1824 preface, his aim was to give an accurate account
of "the beginning of modern history." To this project anything like the
present-day identity historians' concern for giving readers an immediate
"feel" of the past was utterly irrelevant, besides being inconceivable
for a man situated as Ranke was. But it seems to me more accurate to say
that present-day identity historians and their friends are asking too little
of history. For what is missing in the pursuit of the identifying, evangelizing,
entertaining, and utility functions of history is precisely the sense of
any fundamental break between present and past. In a certain sense the
history that is close to us is a dangerous history, precisely because it
is close. One cannot deny that history is tied up with present identities,
but to make those identities fundamental seems to me a fundamental mistake.
Rather than focusing on how history might create and sustain identity,
which in most circumstances can look after itself, and which in some circumstances,
such as those of religious or ethnic conflict, is a positive danger, historians
ought to attend more fully to the critical function of history. I do not
mean that historians ought to turn themselves into practitioners of critical
theory. What I mean is that they ought to attend more fully to the critical
dimension of the historical discipline itself. Rather than seeking to bolster
present identity by linkage to the past, historians in their critical function
ought to highlight what divides past from present. Here it ought to be
a matter not of showing continuity between past and present but rather
of showing how the past provides a reservoir of alternative possibilities,
of paths not taken, of difference. There is within all historical writing
the danger of a retrospective illusion: because the past followed a particular
path, we are inclined to think that it had to follow that path. In a globalized
society, wherein (at least at some levels) homogenization takes place,
it is important to articulate true representations that make vivid other
ways of understanding the world than the ways that are our own.[4]
Allan
Megill is professor of modern European intellectual history at the University
of Virginia. His latest book is Karl Marx: The Burden of Reason (Why Marx
Rejected Politics and the Market) (Rowman & Littlefield, 2002).
[1]Of
course, there are undergraduates and then there are undergraduates: perhaps
the proper description for the audience should be "people assumed to be
not very interested in history." Those of us who teach at institutions
filled with intellectually curious and lively lower-level students can
only regard with awe the efforts of those who have to try to generate such
curiosity and engagement.
[2]Somewhat
ironically, the attempted closeness actually underscored the historical
inauthenticity of Titanic, since the characters of the drama-their dress,
physiques, bearing, voices, language, class relations, desires, sexual
behavior, and aspirations—all had to be calculated according to what a
present-day audience would find understandable, empathetic, and interesting.
The characters thus clashed quite dramatically with the furnishings.
[3]Pity
the historical biographer nowadays who discovers that his subject has no
Innerlichkeit whatsoever. This may in part explain the disastrous fictionalized
biography of Ronald Reagan by Edmund Morris: Dutch: A Memoir of Ronald
Reagan (Random House, 1999). In any case, where identity is uppermost,
evidence tends to be cast aside as irrelevant—a fact that might also help
explain the lies that another historian and biographer, Joseph Ellis, told
about his personal history.
[4]The
theoretical perspective that I articulate here is in part suggested in,
and exemplified by, Michel de Certeau, The Writing of History, trans. Tom
Conley (Columbia University Press, 1988).
Whither
Nineteenth-Century American History?
by
Daniel Walker Howe
How
should 21st-century America view 19th-century America? Every generation,
we have come to expect, reinterprets its past in the light of its own experience.
During the past 40 years we have witnessed dramatic changes in our discipline.
Most notable has been the rise of the New Social History with all its ramifications,
which include such varied and important developments as the use of gender
as a category of analysis, the redefinition of Western history as the study
of a region rather than a process, and a thorough transformation in the
prevailing view of Reconstruction after the Civil War. In the excitement
of the new, other modes of historical inquiry—political, diplomatic, economic,
military history—have been neglected or turned over to other disciplines.
In
presenting my views on where the study of 19th-century America might go
next, I am talking primarily about what I believe historians should do,
not trying to predict what they will do. Actually, I feel reasonably sanguine
about the prospects for 19th-century American history in the coming years,
and the chances that my hopes for the discipline will be realized seem
good. Unlike the early enthusiasts of the New Social History, I do not
repudiate the previous generation’s efforts or seek to jettison them.
We can build upon their achievement and that of earlier generations.
If
I could encapsulate my prescriptions for improving history into a single
phrase, it would be: avoid arbitrary divisions. One such unfortunate dichotomy
is built into the very definition of American history, and that is the
split between United States and all other history. From junior high school
on, students are taught history in two separate compartments: American
history and other (mainly European) history. Most college and university
history departments are bifurcated along these lines in their practical,
everyday organization. At the lower division level, the Americanists teach
introductory courses in American history, while their colleagues teach
Western Civilization or World Civilization. U.S. history is seldom integrated
into Western Civilization or World Civilization. Many practicing historians
of the United States have relatively little training in other kinds of
history, and we tend to specialize more narrowly than historians in other
fields do. Too few of us try to keep up with work done in other areas of
history. Yet experience has shown that very often the best work in American
history has been inspired by work done on other countries, such as the
guidance the French Annales School provided to the New Social History.
American historians working in the United States compound their provincial
isolation by paying too little attention to the work done by historians
of the United States outside the United States. For example, the wide-ranging
and profound scholarship of the British historian William R. Brock on American
history is scarcely known in the United States. Recent initiatives by the
Organization of American Historians have undertaken to bridge this divide,
and I am pleased to see that historians working on the United States outside
of the United States no longer feel quite so marginalized by their colleagues
here.
The
organization of the historical profession around national identities is
and should be headed for serious modification. For American historians,
this means increasing attention to supranational contexts. Perhaps the
most important of these at present is the Atlantic World. David Brion Davis
and Bernard Bailyn have taken a leading part in defining this field, but
many other historians have enriched the study of economic, social, cultural,
and diplomatic contacts and conflicts among the countries and continents
bordering the Atlantic. The Atlantic slave trade, the Scottish-American
Enlightenment, and the Anglo-American community of humanitarian reform
are but three of the best known historical topics illuminated by the study
of American history in an Atlantic context. Although the study of the Atlantic
World has been carried out more by colonialists than by historians of the
19th century, in fact we have as much to learn from it as any. The 19th
century was not the time of American isolation that it sometimes seems.
Most economic and cultural developments of the Victorian age—whether one
is discussing the industrial revolution, the relations between science
and religion, the rise of the novel, or the women's suffrage movement—were
transatlantic in nature and cannot be understood in solely American terms.
As globalization and migration continue to integrate the economies and
cultures of the world in our own time, historians will and should take
it upon themselves to integrate their studies of national histories into
larger contexts.
Religious
history cries out for international treatment, if for no other reason than
that most religious denominations and movements transcend national boundaries.
Church historians have ranged more widely across lines of nationality and
time period than any other historians, and one hopes that the journal Church
History will continue to illustrate this kind of catholicity. Yet even
in church history comparative undertakings are a noteworthy exception rather
than the rule: for an instructive example, see George Rawlyk and Mark Noll,
eds., Amazing Grace: Evangelicalism in Australia, Britain, Canada, and
the United States (1994).
In
principle, the most useful of all contexts for the study of the United
States (and one of the broadest) would be that of white settler societies
in general. Louis Hartz's brilliant work, The Founding of New Societies:
Studies in the History of the United States, Latin America, South Africa,
Canada, and Australia (1964), has yet to be properly followed up. George
Fredrickson's comparative studies of the United States and South Africa,
White Supremacy (1981) and Black Liberation (1995), are among the few examples
we have. The continuing re-examination of America's alleged exceptionalism
by several scholars in different countries is welcome and should be encouraged.
Byron Shafer, ed., Is America Different? (1991) explains why the issue
will not go away. Whether one prefers to emphasize American exceptionalism
or to see the United States as an example of historical developments transcending
national boundaries, meaningful judgments require a basis of comparison.
Besides
national boundaries, another unfortunate division that hampers historical
understanding is the dichotomy between popular and elite. These ill-defined
categories (often simply fuzzy substitutes for class) can betray those
who would rely too heavily on them. It is not wise to assume that leaders
and followers are interested in different things, or that cultural forms
appealing to the rich wouldn't also appeal to others if they could pay
for them, or that people who might seem "elite" by one criterion (say,
education) are "elite" in other respects too (say, income or power). One
of the finest studies of American popular culture is David D. Hall's Worlds
of Wonder, Days of Judgment (1989). Hall comes to the conclusion that Puritan
clergy and laity in colonial New England evinced much the same anxieties,
beliefs, and attitudes; popular culture was not independent of, still less
adversarial toward, the official culture. Historians confronting 19th-century
religion sometimes assume that theology was of interest only to the elite,
but there is plenty of evidence that the people in the pews were as interested
in doctrine as those in the pulpit. Most foreign visitors to the United
States in the 19th century commented on the homogeneity of American culture
across class lines, even when they noted its variation across geographical
and ethnic lines. "Highbrow" and "lowbrow" are terms invented in the 20th
century that would not have been meaningful in the 19th century. Nineteenth-century
authors like Henry Wadsworth Longfellow and Harriet Beecher Stowe were
known and loved by middle classes and working classes; their writings were
part of the experience of English-speaking people all over the world. When
Queen Victoria received Longfellow at Windsor Castle, the servants crowded
about to get a peek at their favorite poet.
A
false dichotomy between "elite" and "popular" culture leads to the false
corollary that people who embrace approved standards of behavior are selling
out. The rise of polite culture, while associated with the rise of the
middle class, affected many working class people, just as it spread across
national boundaries. By adopting the self-discipline of politeness, one
could voluntarily join the middle class (as one could not through an act
of will join the European upper class). Politeness was a vehicle of social
mobility and a characteristic of an open society, but even more important,
it was a means of self-definition. To embrace politeness could be an assertion
of personal autonomy. In this respect, of course, it resembled signing
the temperance pledge and the evangelical decision for Christ. Because
of its unexplored implications for gender history, transatlantic comparisons,
religion, and many of the arts, the history of politeness is one of the
most promising fields of inquiry for the next generation. Historians should
build on works like Paul Langford's Polite and Commercial People (1992;
about England), Richard Bushman's Refinement of America (1992), and John
F. Kasson's Rudeness and Civility: Manners in Nineteenth-Century Urban
America (1990).
Another
misleading opposition, one that I encounter in my current work (a narrative
history of the United States from 1815 to 1848), is that between market
and subsistence agriculture. In the early republic, many family farmers
grew crops mainly for their own consumption. But after 1815 dramatic improvements
in transportation (turnpikes, canals, the steamboat, the railroad) enabled
producers to get their crops to distant markets better than ever before.
The area within which marketable crops could be grown expanded enormously;
cities could be fed; population grew and spread out. This so-called "market
revolution" had a major impact not only on the economy but also on society,
politics, and culture. It prompted white settlers to take land from the
Native Americans; it exacerbated competition between the sections over
whether lands newly opened up would practice slavery or free labor.
According
to Charles Sellers' big book, The Market Revolution (1991), subsistence
farmers only reluctantly responded to the improvements in transportation,
which threatened their security and the self-sufficient way of life they
loved. In his interpretation, the opposition between these family farmers
and the elite that participated in national and international markets formed
the basis for the political and economic conflicts of the age of Jackson.
The elite waged not only a political and economic offensive against subsistence
farmers, but also a Kulturkampf (as he terms it) to transform their attitudes
and values, even their religion, into something more compatible with commercial
capitalism. One problem with Sellers' model is that many farmers had always
practiced a mixture of subsistence and production for a local market. They
might distill some whiskey, or take raw wool or cotton for the wife to
spin into yarn, churn milk into butter, or send a daughter off to work
in a mill. Market activity formed an additional resource in family survival
strategy, as explained by Christopher Clark in The Roots of Rural Capitalism,
1780-1860 (1990). Subsistence farming had been pursued out of necessity
rather than conviction, and when better transportation opened up more market
opportunities, most farm families jumped at the chance.
The
politics of the market revolution focused to a large extent on transportation
projects ("internal improvements" in the language of the age). Tax subsidies
for internal improvements were controversial, not because people didn't
want to market crops, but because taxes were controversial. Many who opposed
internal improvements built by the federal government were only too happy
to support those funded by private enterprise or states, as shown convincingly
in John Lauritz Larson's new book, Internal Improvement (2001). Slaveholders
regarded federal funding for internal improvements with suspicion because
they distrusted big government and centralized planning (correctly perceiving
that these threatened emancipation). Opposition to a particular internal
improvement was more likely to reflect solicitude for slavery or preference
for a rival project than resistance to capitalist values.
There
was indeed a market revolution, and it demands historical illumination.
It rested not only on a transportation revolution but also on a revolution
in communications. Better technologies for printing and papermaking combined
with better distribution and a higher literacy rate to create an explosion
in the publication of newspapers, magazines, and books. Then, in 1844,
came Samuel F. B. Morse's famous message along forty miles of wire from
Washington to Baltimore: "What hath God wrought." The economic, political,
and cultural consequences of the rapid diffusion of information have been
vividly portrayed by Richard John in Spreading the News: The American Postal
System from Franklin to Morse (1995). The next generation of historians
should do still more with the impact of changes in communication, which
have, indeed, become even more evident by the 21st century through the
cumulative effect of successive technologies (radio, television, computing).
The history of the book, a most valuable subdiscipline, can be broadened
into a history of literacy and literate communication.
In
making an argument against arbitrary dichotomies, I do not wish to be misunderstood
as favoring the kind of bland consensus-mongering that makes the writings
of John Dewey (to take a notorious example) so tedious. There have been,
and remain, serious conflicts in American society. Our liberal values,
although hegemonic since 1776, have never been unchallenged. R. Laurence
Moore's excellent book Religious Outsiders and the Making of Americans
(1986) reminds one of just such fundamental value conflicts. The current
ideological assault on our values from outside should provoke a scholarly
reexamination of the liberal tradition, very likely reaching a more sympathetic
conclusion than some historians of the past generation have done.
I
believe that the time is ripe for synthetic treatments that will place
the achievements of the New Social History in perspective. The history
of oppressed and marginalized groups having been recovered, it is time
to bring them into the larger history. To study gender and race should
be to broaden and integrate our concept of history, not to narrow or segregate
it. The division between women's and men's history is at least as artificial
as that between American and British history. Women exerted influence on
American politics even when they could not vote, through writings, petitions,
and control of social occasions, for example. Black history is central
to the whole of American history, and should not be relegated to a ghetto
of its own. A nice recent example of synthetic history is Louis P. Masur's
brief book, 1831: Year of Eclipse (2001).
The
most unfortunate example of compartmentalization does not relate to the
New Social History and cannot be blamed on it. Of all historical specialties
probably the most isolated and therefore underutilized by general historians
is business history. Business history should have much more to offer. Michael
Zakim shows how readable business history can illuminate the social consequences
of the market revolution in Ready-Made Democracy: A History of Male Dress
in the Republic, 1760-1860 (forthcoming from the University of Chicago
Press).
The
final artificial division I wish to see overcome is that between academic
history and popular history. It makes no sense to deplore the public's
historical ignorance when we do not write for that public. A well written
narrative that recreates the excitement of history can appeal to a general
literate audience, as volumes by James McPherson and David Kennedy (among
others) prove. The great historians of the past wrote for general audiences
and made history a part of every educated person's view of the world.
I
have not chosen on this occasion to argue for the study of political, diplomatic,
economic, and military history, although of course all of them merit renewed
attention after a generation of neglect. As these subjects are revitalized
(and they will and should be), they will benefit from interaction with the new
developments in social and cultural history. The key to their renewal lies
in surmounting barriers of specialization, as I have been arguing for surmounting
barriers of nationality and classification. One of the things I like about
The Historical Society (and also about my other favorite professional association,
The Society for Historians of the Early American Republic) is the collegial
cross-fertilization among practitioners of all different kinds of history,
both traditional and trendy.
Daniel
Walker Howe is Rhodes Professor of American History at Oxford University
and a Fellow of St. Catherine's College. He is the author of Making the
American Self: Jonathan Edwards to Abraham Lincoln (Harvard University
Press, 1997).
Join
the Historical Society and subscribe to Historically Speaking
Democracy
versus the Other: Incompatibilities of the Modern World
by
Clark G. Reynolds
America's
democratic republic has been governed by, of, and for its people, a political
philosophy rooted in the humanistic primacy of the individual over the
state. American democracy is defined by free speech, religious toleration,
and free market capitalism. The socio-economic middle class lies at the
heart of the American liberal system.
As
a consequence, throughout its two and a quarter centuries American democracy
has proved less than compatible with all other basic forms of government:
tribal, feudal, monarchical, dictatorial. From the American perspective,
polities such as these may be labeled as the singular Other: autocratic,
authoritarian, absolutist, reactionary.
No
one can deny the fact that the overwhelming majority of peoples throughout
history, not simply the modern period, have been governed basically by
traditional, pre-modern one-man or oligarchic rule. This Other has been
characterized by a state church, centralized economy, landed or party elite,
large standing army, comparatively small navy, secret police, and numerically
huge but subservient peasant/worker class.
Indeed,
in the modern era (post-1500), only three major powers—all of them maritime-focused—have
succeeded in creating genuine alternatives to the Other. Not coincidentally,
these three were also the only true modern nation-states: (1) the Dutch
Republic, ca. 1590s to 1670s; (2) England/Great Britain, ca. 1680s to 1930s;
and (3) the United States, ca. 1940s to the present.
Hegemony
cannot by definition (from the Greek hegemonia, leadership) be shared,
on land or at sea. Making and enforcing international law has been possible
for only one nation at any given time. Thus the three great seaborne hegemons
of modern history, once they achieved great power status, competed or fought
each other for that supremacy. Over time, however, they joined forces in
order to preserve the modern institutions of their making.
Like
all nations after 1500, these three existed within the Machiavellian political
environment of nation-building in the modern period. The Prince,
published in 1532, promoted the quite traditional human practice of raw,
unbridled secular power in place of the lofty but unfulfilled Augustinian
Christian ideals of ethics, mercy, and harmonious order. The three maritime-based
global hegemons were no strangers to such practices. Regrettably, for example,
they all participated in the slave trade and ownership.
But
it was Britain that initiated the global emancipation movement by outlawing
the slave trade in 1807 and then freeing its slaves in 1833, thence obliging
the rest of Europe, including the Dutch, to do the same. The United States,
however, could only settle the issue by civil war, in which the modern/industrializing
northern and western states defeated those of the traditional/agrarian
South. In 1890, at the height of the global British Pax, all Europe, the
U.S., Ottoman Turkey, even Iran and Zanzibar agreed, at least in principle,
by the General Act of Brussels to actively suppress slavery.
The
"modern" era did not really experience its "modern-ness" until the Dutch
effectively cast off Spanish rule by 1590 to create a federated republic
of the seven United Provinces. Based on a two-party political system, it
was the first modern state founded upon bourgeois capitalism, the inventor
of the privately-chartered corporation, and the architect of modern international
law, based on the principle of freedom of the sea. Its navy enforced and
protected these unprecedented institutions and even the small but pace-setting
Dutch army roundly punished continental enemies, in the process reducing
the level of European wartime violence and thereby reshaping organized
conflict until the French Revolution.
The
Dutch Republic became the first European country to accord civil rights
to non-Christians and separatists like the English Pilgrims (who lived
in Leiden before sailing to England in 1620, where they joined others aboard
the Mayflower bound for America). The Dutch Renaissance centered
on free speech, which gave succor to foreign intellectuals, not least the
exiled John Locke, who, while there in discussion with Dutch jurists, wrote
his Second Treatise of Government. In science, technology,
education, publishing, and public services (orphanages, alms houses, hospitals)
the Netherlands led all the world. The glory of its new culture of bourgeois
individualism was preserved in the genre painting of the great Dutch Masters.
Why
so little attention, then, to the Dutch Golden Age in American college
history curricula and courses? And, by contrast, why so much play given
to France, whose first four republics were ultimate failures as "modern" systems?
France simply never completely overcame its "traditional" cast until it
had ceased to be a great power.
Modern
Britain's hegemony began when the Glorious Revolution merged England and
the Netherlands under the common rule of the Dutch William and the English
Mary as constitutional monarchs beholden to Parliament. Holland's partnership
and great power status were over by 1715, and the British Empire with its
Enlightenment culture, global commerce, overseas colonies, and naval supremacy
began its well-remembered two-century-long tenure.
By
the end of the 18th century the fledgling United States, aside from its
rejection of monarchy, was a virtual clone of the mother country. It even
fought a successful trade war alongside Britain against despotic Republican
France (the Quasi-War of 1798-1800) and unilaterally punished the feudalistic
Barbary pirate states of North Africa for capturing and ransoming U.S.
merchant ships and crews (1801-05, 1815).
Save
for the diplomatic blunder of siding with Napoleon in the War of 1812 and
much subsequent Anglophobic bluster, the United States enjoyed British
naval enforcement of the Monroe Doctrine. This enabled young, democratic
America to mature and expand westward without fear of great power interference.
The
Other of the 19th century was tribal, feudal, monarchical; the latter two
categories were even blurred in the case of Latin American governments.
Nomadic or semi-agrarian North American tribal peoples, like those the
world over for centuries if not millennia, fought endemic wars against
each other, often tragically resulting in annihilation, a phenomenon brilliantly
examined in Lawrence Keeley's War before Civilization: The Myth of the
Peaceful Savage (1995). Tribal encounters with migrating peoples of
Eastern and Western civilizations were no different except that these clashes
resulted everywhere in the inevitable defeat and demise of the uncivilized.
With
Britain, the U.S. supported the successful Latin American revolutions which
threw off Spanish and Portuguese colonial rule between 1807 and 1830. These
new nations, however, basically adopted traditional authoritarian rule
on the Iberian model, and this fact, whatever the specific issues and causes, has
invited U.S. opposition ever since: the Mexican-American War (1846-48),
the Spanish-American War to liberate Cuba (1898), the Big Stick interventions
to stabilize Caribbean economies (1901-33), and post-1953 involvement in
Central American and Caribbean countries, sometimes more or less related
to the Cold War or the war against drugs.
The
advent of the 20th century coincided with the maturation of the United
States into great power status: politically, economically, culturally, and
with a major navy. America's rise occurred as Britain's global hegemonic
Pax crumbled in the face of the rising Other across the oceans that wash
America's shores: Imperial Germany and Imperial Japan. After 1900 Britain
concentrated most of its fleet in home waters to deter Germany's, and it
built worldwide alliances and friendships to keep the seas open and its
colonies protected.
A strategically
alert President Theodore Roosevelt understood, more than anyone, that U.S.
solidarity with Britain and active American participation against the Other
were essential to the well-being of the United States in an increasingly
hostile world. "If this country . . . intends to do its duty on the side
of civilization," he proclaimed in a speech in 1905, "on the side of law
and order . . . then it must see to it that it is able to make good . .
. . What we desire is to have it evident that this nation seeks peace,
not because it is afraid, but because it believes in the eternal and immutable
laws of justice and right living." His modern view was not shared by the
aggressive Other.
In
World War I the Anglo-French-Russian Alliance held the line against Germany
until early 1917 when the latter unleashed its submarines against neutral
U.S. merchantmen, and Russian revolutionaries overthrew the tsarist autocracy.
President Woodrow Wilson seized this unique opportunity to try to eliminate
monarchy altogether by calling for the United States to join the Allies
in order to "make the world safe for democracy" as the only viable alternative
to the Other. America entered the war with crusading zeal.
When
the Bolsheviks toppled the makeshift Kerensky government later in the year,
Wilson correctly regarded these brutal communists as a new manifestation
of the Other. At the start of 1918 he issued his Fourteen Points,
designed to eliminate communism and all authoritarianism in favor of a
completely democratic post-war world. And in 1919 U.S. and Allied troops
intervened in the Russian Civil War against the "Reds."
But
neither the Allies nor Wilson's own people shared such a utopian-sounding
and by implication unrealistic vision. So they rejected Wilson's plan to
remake the world, abandoned the Russian "Whites," and took no official
leadership role in the reconstruction of war-torn Europe. Hence the Great
Depression, the rise of fascism, the Stalinist dictatorship and bloodbath,
Japan's invasion of China, Hitler's Blitz and Holocaust, and the fall of
France in 1940. Britain stood alone in Europe's second World War.
Lesson
finally learned. President Franklin Roosevelt repeated Wilson's vision.
The United States, he announced at the end of 1940, must become the "arsenal
of democracy," first by saving Britain and China as part of the massive
U.S. mobilization. Throughout 1941 the U.S. undertook a quasi-war against
Hitler's subs in the Atlantic, extended aid to Soviet Russia after Hitler
invaded it, and with the British called for Four Freedoms to embrace the
entire post-war world: freedom of speech, freedom of religion, freedom
from want, and freedom from fear. That is, put an end to the Other and
its basic causes forever by embracing democratic government, civil liberties,
the end of hunger, and the end of repressive authoritarian regimes. A very
tall order.
On
January 1, 1942, three weeks after Japan attacked Pearl Harbor, the United
States, Britain, and twenty-four other nations called for a post-war United
Nations to assure the Four Freedoms on a global scale. Then, during the
course of World War II, war-ravaged Britain yielded Allied leadership to
the phenomenally powerful and prosperous United States, as did Stalin's
Russia, reluctantly.
Victory
resulted in the conversion of defeated Germany and Japan into clones of
the U.S., massive aid to a devastated Western Europe, and the Cold War
against the Soviet Union as the major Other, followed by Communist China
and the satellites of both. The containment policy, forged by the events
of 1946-50, aimed at the destruction of the Other, focusing on the Soviet
Union as the "evil empire," to use the later words of President Ronald Reagan.
In
1989-90 containment succeeded due to the inability of the U.S.S.R. to modernize.
Even during the course of the Cold War, however, the U.S. exerted hegemony
in the American Pax and labored, unevenly, to maintain stability between
peoples liberated by the collapse not only of the Axis empires but also
of those of Britain, France, and the Netherlands.
Then
came September 11, 2001. The Other, as always, comes in the same form:
tribal, feudal, dictatorial governments or factions which desperately cling
to traditional ways. The Other denies universal liberal education and, more
generally, all modernity in order to hang on blindly to its rapidly eroding traditional
power.
Historians
can do no better than to understand the fundamental incompatibility of
democracy and the Other. The democratic United States has consistently
used its power to try to modernize the world. This has led to inevitable
clashes with the Other.
The
compelling tenets of American democracy have been the powerful thread running
through the history of the United States. The internal mechanics of democracy
have always centered on the art of compromise. But, by definition, democracy
itself cannot be compromised and expect to survive and grow.
Clark
G. Reynolds is Distinguished Professor of History at the College of Charleston.
His most recent book is Navies in History (Naval Institute Press, 1998).
Jeffrey
Burton Russell: An Appreciation of a Career on the Edge of the Profession
by
Rick Kennedy
Jeffrey
Burton Russell is not dead. Although he retired from the history department
at the University of California, Santa Barbara in 1998 and a Festschrift
has been published, he is still writing. At present, he is working
on a history of the concept of heaven for Oxford University Press (Princeton
University Press published his previous book on the subject). He is probably
best known, however, for his five-volume history of the concept of the
devil (Cornell University Press, 1978-1988). These books are all still
in print with translations available in several languages. Russell began
his career with books on medieval heresies and witchcraft. Not surprisingly,
then, back at UCSB we graduate students knew not to bother him around Halloween—that
was when he would be busy with crank calls and requests for interviews.
Awarded
by his colleagues the honor of Faculty Research Lecturer in 1991, Russell
prefaced his presentation with a story of how his father was expelled from
the University of California, Berkeley in 1926 for blasphemy, pornography,
and bolshevism. Blasphemy, Russell explained, because his father published
in The Daily Cal a poll showing that a majority of undergraduates
did not regularly attend church; pornography for writing a story that included
a young woman in a “diaphanous gown”; and bolshevism for writing an article
condemning compulsory ROTC. Lewis Russell could have recanted and thus
been able to graduate; however, as his son explained, “he was a principled
heretic and refused.”[1]
Inheriting
the genes of a California heretic helps explain the career of Jeff Russell.
His students and readers have long found in him something mildly heretical,
out on the edge, and highly principled. Everyone with whom I had contact
in the history department had great regard for Russell as a person and
scholar, but there was giggling in the halls when he offered a graduate
seminar called “Truth.” Real historians were not supposed to waste a whole
seminar reading epistemology.
Russell's
books were sometimes harshly reviewed because they did not fit easily into
the normal categories of professional history. Some in our own department
did not really think he was "doing history." His histories of the concept
of the devil were about the development of people's perceptions of the
devil, but were also, confusingly, not just about the people; they were also
about the devil and the evil that people experienced. Russell was obviously
carving out something other than either the history of mentalités
or the history of ideas. He called it the history of concepts, but his
notion of concept carried more freight than people expected.
In
an article for The Historian entitled "History and Truth," Russell
wrote that "a concept consists of the tradition of the views of
what it is." Concepts can be very specific, such as "cat," or fuzzy, such
as "democracy." In the case of the former, the history of the concept was
not much needed to define it; whereas, in the case of the latter, the history
of the concept showed the extent of its undefinability.[2]
A concept "includes the affective as well as the analytical and has hazier
boundaries; whereas an idea is rational, a concept draws upon unconscious
patterns as well as conscious constructions. To the history of concepts,
myth is as interesting as philosophy."[3]
If
Russell had sat contentedly on that definition of concept, there probably
would not have been any giggling in the halls or harsh reviews. But Russell
dug deeper into his subject. He was not content simply to find evidence
of developing concepts; he wanted to use the developing concepts as evidence
for possible truths. Russell was quite unashamed of his belief that ineffable
truths exist and historians should try to get at them. Even more unfashionably,
he believed that the academic discipline of history was the high road to
those truths. "History is aimed at the truth," he proclaimed, and can give
us "the surest truth available to us in this world."[4]
It
is important to note that Russell did not say that historians attain
truth; rather, he believed historians have access to much that is intentional
toward truth. Since throughout history people have constructed concepts
to help them understand truths, the accumulation of concepts that develop
into long traditions indicates a human consensus about truths. Historians
have the "surest truth available to us in this world" because in the long
run of time people are getting glimmers of truth and the development of
those glimmers into dominant traditions offers evidence pointed toward
that truth.
All
historians have deep faiths that make it intellectually respectable to
muck about in the past. Russell's faith is that history is connected within
a cosmos that is not fully opaque to human understanding, and, like Descartes,
he must believe there is no malin génie capable of misdirecting
all human pursuits of truth. Only with such faith can Russell justify the
way he goes beyond the history of mentalités and history
of ideas into a history of concepts that actually exist as evidence for
the clouded truths of those concepts.
To
historians deeply committed to professional norms, Russell and his history
of concepts seem wacky. Certainly, as a student, I was confused as to where
Russell fit within the profession. It is probably too much to say that
he invented his method of handling concepts as evidence, but he recently
told me he is not sure of his sources. The footnotes in his books give
little evidence of working out someone else's ideas.
Two
books do, however, seem to be the major foundation stones of his career:
John Henry Newman's Essay on the Development of Christian Doctrine (1845)
and R.G. Collingwood's The Idea of History (1946). These two books
played a prominent role on the reading list of a graduate seminar I took
from Russell in the winter of 1982. The seminar's title, again raising
giggles in departmental halls, was "Tradition."
The
purpose of the seminar was to teach us to be highly mindful of the way
traditions work in history. Discussing Edward Shils's Tradition
(1981), Russell indicated that he believed people's perceptions of the
cosmos accumulate authority as they are reflected upon generation after
generation and become dominant traditions. Identifying multifaceted traditions,
seeking out the tenets at their cores, discovering their earliest beginnings,
documenting the accumulations over time and the detritus left behind, he
believed, constituted a type of evidence gathering for not only the existence
of a tradition, but also the possibilities of a truth that the tradition
pointed toward.
The
focus of our seminar was on the history of early and medieval Christianity.
Newman's Essay was one of our principal texts. Newman believed that
the history of the Christian tradition should be viewed as a revelation
with human historical authority. But the authority is dynamic. Russell's
definition of traditional concepts distantly echoes Newman's definition
of development:
"This
process, whether it be longer or shorter in point of time, by which the
aspects of an idea are brought into consistency and form, I call its development,
being the germination and maturation of some truth or apparent truth on
a large mental field."[5]
For
Russell, a deeply committed Roman Catholic, Newman made sense of the dynamic
way Christianity should be defined. On the other hand, we should not make
too much of Newman when trying to understand Russell. Newman was a prickly
character who wanted to confirm specific dogma with historical scholarship.
Russell, on the other hand, is a tolerant soul who finds that the history
of traditions does not create strict definitions; rather, it opens historians
up to a greater range of subjects to study, wider understandings of human
experience, and glimmers of truths.
Newman
probably does not appear on many graduate historiography seminar reading
lists, but Collingwood does. With The Idea of History, Collingwood
tried to inspire the historical profession with the belief that we can
know the thoughts of people in the past and that the methods historians
use and the knowledge they attain should not be judged by the methods and
knowledge of the natural sciences. "The historian," Collingwood declared,
"is master in his own house; he owes nothing to the scientists or to anyone
else."[6]
Russell wanted his students to appreciate the freedom and challenge to
which Collingwood called us.
As
with Newman, Russell has major differences with Collingwood. Most importantly,
Collingwood wanted to emphasize the autonomy of a true historian who re-enacts
historical events and thoughts in his or her own mind. Russell emphasizes,
rather, the intimacy of communication with the people of the past and has
no commitment to Collingwood's theory of mental re-enactments. There is
a difference between the spirit and the letter when reading Collingwood.
Russell loves the spirit of free historical inquiry that refuses to be
bound by the standards of modern natural science. He is not interested
in the letter of Collingwood's specific philosophy of history.
Twenty
years after that seminar, Russell continues to explore the borderlands
of the profession using insights he gleaned from Newman and Collingwood.
His use of traditional concepts as historical evidence has continued to
evolve. Probably the most distinctive role he has played within the profession
in the last twenty years has been as our most wide-ranging believer in
the fullness of human communication. Russell has been increasingly adamant
that if a historian will listen with love, he or she will hear.
"History,"
Russell told his colleagues in his 1991 Faculty Research Lecture, "is the
study of the communion of saints, by which I mean the whole flawed, failing, loving, human race past and present through space and time. The whole point of history, and its great joy, is to encounter people in the past as real people, to rescue them from oblivion, to restore them as living, four-dimensional
people. That means confronting them in their own terms, unblinkered by
any modern ideology or academic fashion."[7]
For Russell, historians must first and foremost perceive themselves in
communion with the people they study. Haughtiness, dispassion, and an unwillingness
to listen are as destructive to history as they are to a dinner party.
Only
by listening can we gain access to some of the fullness of how people experienced
life in the past. Russell's five books on the devil are unified by the
universal experience of evil. Russell has experienced it. The people of
the past experienced it. From the conversation of shared experience arises
the outline of how throughout the history of monotheistic religions we
personify "the prince of evil." Does this prove the existence of the devil?
No. But someone willing to listen who has also felt the cold of evil will
be opened up to the possibility, and maybe get an inkling of the truth.
"My
quest for an understanding of evil," Russell wrote in The Prince of
Darkness, "has been a personal search as well as scholarly research
. . . knowledge without love, and scholarship without personal involvement
and commitment, are dead."[8]
Russell believes that the history of the concept of the devil can actually
lead to a better understanding of the real history of evil in the world.
He opens A History of Heaven with: "A normal human being longs for
three things that cannot be attained in this life: understanding of self,
understanding of others, and understanding of the cosmos. We cannot be
sufficient unto ourselves. We are created for the connection with others,
for the connection with the cosmos, for the dynamic connection among ourselves
and with God. When we ask for connection, we are often met by silence.
But if we listen, the silence sings to us."[9]
The
concepts of the devil and heaven as they develop are the manifestations
of a community's minds working through time. Only by listening to the whole
community does Russell believe we can fully hear. No one philosopher's
idea or institution's dogma encompasses the concept; rather, it is the
complex property of all who have experienced evil and perceived it in some
way personified, or experienced the sense of a joy that will come and reified
it as heaven. The sheer weight of so many people sharing similar perceptions for so long lends credence to the possibility. Certainly no human concept can be an exact and true description of some spiritual reality. But the shared perception of a communion of people through time can be dismissed easily only by the haughty, dispassionate, and unlistening.
Russell
does not reject the standards of professional critical history, but he
believes that such standards must be wielded with the vulnerable love that
enlivens and brings joy to a dinner party. The popular image of the historian as detective (the loner, usually jaded, most often distrustful) is beguiling but wrongheaded. History is not powered by the skepticism of loners; rather,
it is powered by trust within a community.
Russell
gathers his evidence by communing with the people of the past. He listens
to what his sources tell him. Being a historian of religious matters, much
of what his sources tell him is about spiritual things. Herein lies Russell's
high commitment to listening: he grants the possibility that people in
the past may actually have communicated with God. Let us be clear that
Russell does not believe that the professional historian has a duty to
listen to God. But Russell believes we have to listen with openness and
love to those who say they have heard directly from God.
Russell
works on the edge of the profession. He is a mild heretic. He does not
think he has described the being of an actual devil or knows what heaven
really is like. He separates the fullness of his religious life from the
limits of a humanistic profession. What puts him on the edge, however,
is that he thinks history might offer an inkling of knowledge about personified
evil and paradise. What also puts him on the edge is that he doesn't find
it easy to separate his religious life from his profession. Russell believes
that the highest calling of a truly humanistic historical profession is
that it be intentional toward truth. "History," he declares, is "more than
an intellectual exercise: it is a sacred calling."[10]
Rick
Kennedy is professor of history at Point Loma Nazarene University. He is
the author of Aristotelian and Cartesian Logic at Harvard: Morton's "System
of Logick" and Brattle's "Compendium of Logick" (Colonial Society of Massachusetts
and the University Press of Virginia, 1995).
[1] Jeffrey Burton Russell, "Glory in Time," Soundings 22 (1991): 44.
[2] Russell, "History and Truth," The Historian 50 (1987): 9-10.
[3] Russell, Satan: The Early Christian Tradition (Cornell University Press, 1981), 21.
[4] Russell, "History and Truth," 13.
[5] John Henry Newman, Essay on the Development of Christian Doctrine, I.i.5.
[6] R.G. Collingwood, The Idea of History (Oxford University Press, 1946), 155.
[7] Russell, "Glory in Time," 45.
[8] Russell, The Prince of Darkness (Cornell University Press, 1988), xi.
[9] Russell, A History of Heaven (Princeton University Press, 1997), 3.
[10] Russell, Satan, 19.
Late
Ancient and Byzantine History Today
by Warren Treadgold
Around
1970, when I entered graduate school, my chosen subject of late ancient
and Byzantine history seemed to have a bright future. Though the field
had traditionally been strongest in Britain, by then nearly all major American
universities and even some smaller ones had Byzantine historians.
Harvard's opulently endowed Center for Byzantine Studies at Dumbarton Oaks
in Washington, D.C., had a half-dozen research professors and sponsored
several archeological projects and many publications. The standard history,
the revised German edition of George Ostrogorsky's History of the Byzantine
State, appeared in English translation in 1969, and important books
in English were published every year. Particularly notable were two massive
British projects (each over 1500 pages): A. H. M. Jones's The Later
Roman Empire, 284-602: A Social, Economic, and Administrative Survey
(1964) and a new Byzantine volume of the Cambridge Medieval History
by a team of British, American, and other historians (1966-67).
At
that time most scholars still thought of Byzantine history, like all of
ancient history, as part of Classical studies. The Byzantine Empire, though
it lasted through the Middle Ages, was after all just the Eastern Roman
Empire under a modern name (the Byzantines always called themselves Romans),
and its rulers spoke first Latin and then Greek. Most late ancient and
Byzantine historians still admired Edward Gibbon's Decline and Fall
of the Roman Empire, thought Byzantium represented a decline from the
early empire in many respects, and mainly wrote political history based
on literary sources. Such was Ostrogorsky's excellent History, originally
prepared in 1940 as a volume of a Classical handbook.
Yet
the field was developing. Jones's subject was how the later Roman Empire's
society, economy, and institutions functioned, and his sources were more
often saints' lives, orations, letters, papyri, or inscriptions than historical
texts. The Cambridge Medieval History volume was divided into a
part on “Byzantium and Its Neighbours,” which gave more space to the empire's
relations with other cultures than to its internal history, and a part
on “Government, Church, and Civilization,” which was thematic rather than
chronological. Both works showed a growing tendency to go beyond political
history and literary sources.
But
around 1970 the boom in American higher education ended. At first Byzantium
and Late Antiquity seemed to be keeping their share of the reduced job
market, perhaps because they linked ancient, medieval, Muslim, and Slavic
history, any of which a Byzantinist might teach. On the other hand, academic
fashions were turning against Classical studies as traditionalist and elitist,
and that could only be bad for the study of Byzantium.
Meanwhile,
however, the fashionability of Late Antiquity, and with it the earlier
part of Byzantine history, found a champion in the British historian Peter
Brown, who applied the poststructuralism of Michel Foucault to Late Antiquity.
According to Brown, people in Late Antiquity cared most about power, sexuality,
and rhetoric, and not really (as previously believed) God, morality, and
chariot racing. Brown attached far more importance than earlier historians
to living holy men (not dead saints), whom he saw as combining the roles
of politician, psychiatrist, and guru. Insofar as Brown referred to religion,
it was something vaguely New Age that he called "the holy." His work won
acclaim chiefly from historians outside the field, who found its depiction
of Late Antiquity intriguing even if they rejected poststructuralism in
their own scholarship. In 1977, Brown became a professor at Berkeley, where
his popularity increased.
Also
in 1977, Harvard made some major changes at Dumbarton Oaks, whose endowment
was then about a twentieth of the university's billion-dollar total. According
to well-informed reports, the Harvard administration wanted to invoke an
escape clause in the original donors' will, which would allow Harvard to
use the endowment as it saw fit and in particular to move the Byzantine
Center to Cambridge. After partial disclosure of this plan attracted adverse
publicity, the newly named director of Dumbarton Oaks, the Harvard medievalist
Giles Constable, declared that there had never been such a plan. But he
soon stopped using the name of the Center for Byzantine Studies, abolished
its by now depleted faculty, and ended its archeological program. Constable,
whose own specialty was French monasticism, advocated a comparative approach
that would stress Byzantium's similarities to medieval Western Europe and
Islam.
Soon
afterward the academic popularity of Byzantine history began to wane, as
measured by positions retained or created in American universities. Byzantinists
departed without being replaced at Yale, Stanford, Columbia, UCLA, Indiana,
North Carolina, Texas, and elsewhere. Eventually no major American university
had a full-time, senior Byzantine historian training graduate students.
Dumbarton Oaks sponsored some joint appointments combining research fellowships
with assistant professorships at different universities, but only one of
these resulted in a permanent appointment in Byzantine history (at Rutgers).
Several of the most promising junior Byzantinists left academia. Late
Antiquity did little better, except for a few students of Peter Brown (who
moved to Princeton). Byzantine studies suffered particularly from the signal
sent by Harvard's treatment of Dumbarton Oaks. The new subsidized positions
seem to have reinforced impressions of a sick institution and a failing
field, especially because Dumbarton Oaks avoided any commitments that cost
much money or lasted more than a few years.
Efforts
by some historians, especially desperate ones seeking jobs, to follow either
the postmodernism of Brown or the comparative approach promoted by Constable
had, at best, mixed success. Brown's combination of a sense for academic
fashion with a selective use of the sources proved hard to imitate or to
expand upon. Much of what Brown wrote would have been sharply criticized
as unfounded or erroneous if published by anyone with less prestige. The
trouble with the comparative approach was that most of Byzantium's supposed
similarities to the West and Islam turned out to be superficial. The fact
that Roman rule and Greek culture survived in Byzantium but not in Western
or Muslim lands made Byzantine society profoundly different from both.
In the end, the comparative approach dissociated Byzantium from ancient
history without securing its place in medieval history.
Probably
the most favorable development in American Byzantine studies at this time
was the arrival of the productive and perceptive Russian Byzantinist Alexander
Kazhdan, who in 1978 took a position at Dumbarton Oaks without faculty
rank or administrative authority. Though the books he published in America
were all co-authored (one with Constable), his collaborators served mainly
to help him with his English. Finding comparisons of Byzantium with Antiquity
dangerous because they tended to imply that Byzantium was decadent, Kazhdan
avoided them by arguing that the Arab and Slavic invasions of the 7th century
had essentially destroyed the continuity of the empire, making way for
a new society and culture.
Kazhdan
had little interest in Peter Brown's work, or in comparisons of Byzantium
to the West or Islam. His main interest was the large and largely unfinished
business of learning about Byzantium itself. Kazhdan's great monument is
the three-volume Oxford Dictionary of Byzantium (1991). While many
scholars contributed to it, Kazhdan planned it in detail, wrote a large
share of its articles, and edited, revised, or rewrote many of the others.
The Dictionary reflects his vision and shows his strengths. Without
paying much attention to problems of continuity or decline, it illustrates
Byzantine cultural ingenuity and social resilience, and catalogues and
explains the insights and discoveries of Kazhdan's lifetime of research
on Byzantium. Without distorting the evidence, it makes Byzantium look
interesting and Byzantine studies look vigorous, though the vigor was mostly
Kazhdan's. Kazhdan died suddenly in 1997, leaving unfinished a History
of Byzantine Literature that is as illuminating a collection of information
as the Dictionary, if almost as episodic. Though his ideas have
been influential, they provide little overall framework for Byzantine history,
and even suggest that none may be applicable.
Meanwhile
publication of scholarship in late ancient and Byzantine history has continued,
largely without reference to postmodernism, comparative approaches, or
any overarching interpretive frameworks. Much of it, especially on the
European continent, consists of editing, translating, and commenting on
texts and creating research tools like catalogues and prosopographies.
This specialized work is often well done, and seldom distorted by scholarly
fashions. However, it seems always to be preparing for more general work
that, if it ever comes, is liable to be overwhelmed by more tools and information
than it can use. Therefore, for better or worse, a fairly small group of
British, American, and German historians are chiefly responsible for the
scholarship that shapes current conceptions of Byzantine history. The largest
part of it has been on Late Antiquity, the period from about 300 to 600
A.D. that may also be called early Byzantine. Some outstanding studies
have been published, notably by Ramsay MacMullen and Roger Bagnall in America,
Alan Cameron and John Matthews first in Britain and then in America, and
Timothy Barnes in Canada. These and other historians have built upon the
work of A. H. M. Jones (who died in 1970) to provide a detailed and rounded
picture of late Roman cities, government, culture, religion, and society
in general. Yet most of their work has more or less ignored private behavior
and beliefs, while Brown and his disciples have more or less ignored everything
else. The division has been unhealthy for both sides, because it permits
Brown's views to stand as an unquestioned orthodoxy in their own sphere
and prevents a full understanding of late ancient Christianity, which was
more than just a matter of power or psychiatry.
The
middle Byzantine period, from about 600 to about 1000, has developed another
orthodoxy that often goes unchallenged, this time a Marxist one. Its main
proponents are Paul Speck and Ralph-Johannes Lilie in Germany, John Haldon
in Britain, and their students. It begins not with a misreading of the
sources but with a radical distrust of them; as Speck once remarked to
me, "They all lie!" At its best, this attitude leads to criticizing sources
that are clearly tendentious. At its worst, it leads to dismissing sources
simply because they fail to fit such ideological preconceptions as skepticism
about religious motives, minimization of the importance of individuals
in history, or emphasis on the role of the state.
Haldon,
in particular, argues that Byzantium survived the 7th-century invasions
by increasing state control. Yet any coherent reading of the sources shows
that the state, having lost the greater part of its productive land to
the Arabs and Slavs, was forced to reduce its control, notably by giving
its soldiers state land to replace most of their pay. Haldon can avoid
this conclusion only by assuming that the army was much smaller than all
the sources indicate. His work, like Peter Brown's, has won acceptance
from some scholars who seem unaware of how much it depends on an ideology
they do not share. For example, the British historian Mark Whittow has
used Haldon's criticism of the sources for a large army to argue that the
Byzantine state was small and disorganized, though Haldon's reason for
disregarding the sources was to show that the state was large and powerful.
The
next Byzantine period, from about 1000 to 1200, has attracted a good deal
of interest recently. Traditionally historians had seen this as a period
of inexorable decline leading up to the empire's crushing defeats by the
Seljuk Turks in 1071 and the Fourth Crusade in 1204. Kazhdan and others,
notably Michael Angold and Paul Magdalino in Britain, have instead stressed
the growing strength of the empire's economy and culture throughout the
period. Yet, apparently out of reluctance to consider evidence of Byzantine
weaknesses that imply at least political decline, they give no clear explanation
of why this thriving state lost to the Turks and Crusaders. During the
final period of Byzantine history, up to the fall of Constantinople in
1453, the political (though not cultural) decadence of the empire became
so obvious that no one feels able to deny it. Perhaps not by coincidence,
the period is the subject of that rarity in Byzantine studies, a satisfactory
comprehensive history, the British historian Donald Nicol's The Last
Centuries of Byzantium (rev. ed., 1993). Of course, a number of historians
have done important work on subjects not directly embroiled in the scholarly
controversies mentioned here. Examples in America are Clive Foss's studies
of Byzantine cities and forts, Mark Bartusis' work on the late Byzantine
army, and John Nesbitt's catalogues of the Byzantine seals at Dumbarton
Oaks.
Nevertheless,
if one follows the reigning fashions, late ancient and Byzantine history
looks something like this: a period of poststructuralist holy men from
300 to 600, a period of Marxist social forces from 600 to 1000, an economic
and cultural golden age from 1000 cut short by inexplicable defeat in 1204,
then a decline down to 1453. All this fits passably with Kazhdan's ideas
of "discontinuity" in Byzantine history, and seems to justify his lack
of enthusiasm for any general narrative history of Byzantium. Yet even
Kazhdan was bothered by the combination of some of these discordant elements,
and a few others have tried to make a little more sense out of the whole.
The
best alternative to Peter Brown's views is in the work of Cyril Mango,
who moved in the opposite direction from Brown, from America (Dumbarton
Oaks) to Britain (Oxford). The index of Mango's Byzantium: The Empire
of New Rome (1980) lacks entries for such subjects as "holy man," "oratory,"
"body," or "sexuality," which Brown considers vital but Mango finds of
little significance in Byzantine civilization. Mango's Byzantines are much
the same from Late Antiquity to 1453: conservative, religious, puritanical,
and inward-looking. Unlike Kazhdan and many others, Mango sees no reason
to deny that Byzantium declined. He may overemphasize the Byzantines' backwardness,
because what he says of most Byzantines was equally true of most ancient
Greeks and Romans, and most people in predominantly agricultural and illiterate
societies everywhere. But it is worth noting that most Byzantines disapproved
of sex, preferred dead saints to living holy men, and found formal oratory
incomprehensible. They would have detested Michel Foucault, especially
for his homosexual activities, which under Byzantine law were punishable
by castration in the 6th century and execution in the 8th century.
Another
leading generalist is the British Byzantinist Michael Hendy, whose lengthy
Studies in the Byzantine Monetary Economy (1985) has been more often
praised than read and used. It makes a practically conclusive case that
the empire's survival in the 7th century came about through a deliberate
reduction of the role of government, though by premodern standards Byzantium
continued to be an organized state. Hendy was the first to demonstrate,
in an article in 1970, that the Byzantine economy was thriving in the 11th
and 12th centuries, and long before. But he realized that the state often
failed to tap the strength of the economy, so that the state could go bankrupt
while its subjects grew rich. Hendy's book, though long, was meant only
as a preliminary study for a full history of the Byzantine economy, which
remains to be written and is much needed.
In
my own History of the Byzantine State and Society (1997), I tried
to tie up some of the remaining loose ends. I relied on Mango and Hendy,
and often on Kazhdan, but, as I have already suggested, no one could write
a coherent general history while following all the main secondary literature.
I discussed Byzantine weaknesses, even those that imply the dreaded phenomenon
of decline, and tried to consider both social and political history without
confusing the two. Thus I saw no contradiction in concluding that Byzantium
was declining economically in the 7th century but was rescued by competent
administration, or that Byzantium was expanding economically in the 11th
and 12th centuries but was wrecked by incompetent administration. Reviews
of my book have been relatively few and not particularly favorable, though
I was heartened by how few errors the reviewers found. The most common
criticism was that I disagreed with the secondary literature, especially
that written by the reviewers. The field seems still to be waiting for
a postmodern, Marxist comparativist who believes in discontinuity but not
decline.
What
wider issues can the fate of Byzantine and late ancient history during
the last thirty years illuminate? In the widest sense, it suggests that
the academic job crisis in America has led many scholars to follow academic
fashions that are irrelevant to their fields, to become apologists for
their subjects to the point of distortion, and to concentrate on selected
topics that sound interesting at the expense of a coherent larger picture.
Somewhat more specifically to Byzantine studies, neglect of the Greek and
Latin Classics and an unwillingness to take Christian beliefs seriously
have become obstacles to understanding Western civilization, whether premodern
or modern. What happened to Dumbarton Oaks, though even more specific to
Byzantine studies, is a far from uncommon example of what happens when
universities think of themselves as businesses managing resources, not
communities dedicated to scholarship.
Are
there any positive signs? At least in the study of Byzantium and Late Antiquity,
academic fads do seem to have become somewhat less popular as they lead
to intellectual dead ends. Harvard appears to have got little good out
of gutting Dumbarton Oaks, now mostly deserted except for underemployed
librarians and visiting European scholars. Recently the job market in late
ancient and Byzantine studies has improved slightly, as some universities
have found that Antiquity, the Middle Ages, Islam, and modern Eastern Europe
are hard to study apart from Late Antiquity and Byzantium, which join them
all together. Finally, the fragmentation of American higher education seems
to have gone so far as to provoke a longing for more coherence, which in
turn may attract more attention both to the Classics and Christianity and
to Late Antiquity and Byzantium.
Warren
Treadgold is professor of late Ancient and Byzantine history at Saint Louis
University and the author of several books on Byzantine history and literature.
DISPATCH
FROM THE UNITED KINGDOM
J.H.
Plumb and the Problem of Historical Reputation
by
Jeremy Black
How
should we as professionals record and discuss the life, work and influence
of those who have recently died? Invariably, this is handled in a delicate
fashion with laudatory remarks directed towards the deceased. This arises
from a sense of loss focused more particularly by sympathy for those who
were close to the deceased. An aura of sanctity seems to descend in the
obituaries. Years, sometimes decades, later, when their papers are opened and scholarly study is directed thither, a different picture can emerge.
At the same time, this passage of years can ensure that much of the personal
impression has been lost, and, more particularly, that the complexity of
the impact we all have on contemporaries is lost.
These
thoughts were thrust into my mind by the response to the death on October
21, 2001, of Sir Jack Plumb, Professor of Modern English History at Cambridge,
1966-74 and Master of Christ’s College, Cambridge, 1978-82. Plumb taught
and supported the careers of several prominent modern historians, and his
death provides an opportunity to see how historical reputation is constructed
and to pose the question of how we should respond.
Eulogistic
obituaries by Plumb’s former pupils appeared in the press. Simon Schama
ended his in the Independent by declaring: “Should history somehow
survive as the great art it has been … should it somehow keep a place in
the indispensable archive of our beaten-up world, it will be because Jack
Plumb wrote and taught and lived as he did.” In the Guardian, Neil
McKendrick stated that Plumb won his “war” with Geoffrey Elton: “The study
of history has marched irresistibly in the direction in which he predicted
and led.” Yet none of this captured the animosity inspired by a man spoken of as evil by Richard Cobb, Jack Gallagher, and other major scholars of their and Plumb’s vintage; nor, more seriously, did such obituaries explain the tensions on which this animosity focused.
I was
puzzled by this contrast, and decided to write an alternative appreciation,
one that would focus on the scholarship, as, unlike the obituarists, I
have worked on most of the material used by Plumb and have also published
books on Walpole and the politics of his period. I intended to say nothing
about the man. I had met him, and will come to that, but did not know him
well and had no real view one way or other: although I was an undergraduate
in Cambridge, Plumb no longer lectured, and I left to do graduate work
in Oxford. Yet I could not help asking other scholars if they knew Plumb
and was struck by their responses. I was told story after story that I
shall not repeat here reflecting very negatively on Jack Plumb as a human
being. None of this may seem of any consequence, but it suggests to me
that the rivalry between Plumb and Elton discussed by the obituarists (most
fruitfully by the perceptive David Cannadine, another former protégé,
in the Institute of Historical Research's electronic Reviews in History
[December 2001, with an edited version in History Today February
2002]) should be placed as much in the personal as in the professional
field, and that his personality may have prevented Plumb from gaining the
position and prestige he avidly pursued, not least the Regius chair at
Cambridge and a peerage.
Position,
prestige; the third is patronage. There is no doubt that Plumb's skillful
and even ruthless use of patronage helped arouse the anger and maybe envy
of others. This determination extended across the Atlantic. I can recall
the head of one Ivy League department telling me about the "extraordinary
pressure" brought to bear to take a Plumb candidate. There was certainly
a malignity that was the other side of his active sponsorship of his own
reputation and the careers of his protégés. Others were abused,
damaged, and harmed.
To
offer a personal note: born in 1955, I was too young to have anything much
to do with Plumb. However, in 1978 I was summoned to see him in order to
be told there was no point persisting with my graduate work on Walpole's
foreign policy, as he was going to publish the third volume of his biography of Walpole the following year, and any other work would be redundant.
the very good suggestion that I work on 18th-century espionage, but he
must have known that he was lying about volume three. There is a difference
between attempting to finish a book and knowing, through the absence of
a finished text, that it is not forthcoming. Fortunately, I ignored his
admonition. Less fortunately, it led me not to consult him about my own
work on Walpole, which is a pity as, in his first volume, he did capture
the personality of the man and the issues that surrounded his rise to power.
Other graduates who were not his pupils also experienced his range and
rage. One friend of mine who had the temerity to publish a critical review
of a work by one of Plumb's protégés, now the holder of a
major post, was invited to dinner at Christ's and told that this was not
the way to secure a career in Cambridge.
On
the other two occasions I talked with Plumb I saw his variety. Taken by
Lucien Lewitter to dinner at Christ's and seated next to Plumb, I found
him a very interesting and agreeable man to talk with, although he did
not discuss the 18th century. Instead the novels of Balzac were the lead
topic. The other occasion was less favourable: with glass in hand at a
reception he was being unnecessarily dismissive about the abilities of
a protégé for whom he said he had helped swing a post.
Plumb
would not have wished to be seen as Walpole was by his critics, as a master
of patronage, but, rather, would have preferred his own portrayal of Walpole,
one that is in fact more accurate, as a statesman who had ideas as well
as interests, and supported policies as well as patronage.
Any
such comparison between scholar and subject cannot be pushed too far. There
were major differences in personality as well as role, although both had
the quality that in the past was described as parvenu: having gained wealth,
they liked to display it and to enjoy its fruits. As the obituarists noted,
Plumb lived life to the full, enjoying fine wines, collecting porcelain
and paintings, and offering a reminder of the princely magnificence shown
by Walpole in the great house he built at Houghton, where the profits of
office led, for example, to the yards of gold lace around the green velvet
state bed that so impressed the Duchess of Kent in 1732.
The
Walpole biography (1956-60) brought Plumb fame, but was never finished,
and remains curiously emblematic of Plumb's entire career. The two volumes
that appeared were popular works and attracted much attention, but their
long-term impact has been minimal. The account of the rise to power was
more innovative than the subsequent discussion of Walpole in power. Plumb's
other biographical work, his treatment of William Pitt the Elder, 1st Earl
of Chatham (1953), displayed a similar acuteness in the distinction of
character, but was a shorter and lesser work and had no lasting impact.
The
Growth of Political Stability in England, 1675-1725 (1967), originally
the Ford Lectures at Oxford, and Plumb's most significant book in the world
of scholarship, was important in its discussion of political structures
and shifts, and a world away from the minutiae of political maneuvers.
Plumb searched for an overarching analysis in a manner that Namier did
not pursue. However, he underrated Jacobitism and the post-1715 Tory party,
and his decision to concentrate on England, rather than the British Isles,
provided a misleading perspective, not least because developments in Ireland
and Scotland were crucial to the situation in England.
Thereafter,
Plumb's contribution to scholarship on the period was less than his sense
of his own importance might suggest. Plumb's published lecture on the commercialization
of leisure in 18th-century England was interesting, but he failed to follow
it up adequately, and the notion of a consumer society was pushed without
sufficient qualification. Instead, as with Plumb's work on Walpole, there
was a curiously unfinished feel to his career. He was an active lecturer
in America, a frequent reviewer, and the general editor of a number of
important series, which greatly helped him to provide patronage as well
as to try to mould the historical consensus, but his own work diminished.
In contrast, for example, to John Ehrman, who finished the third of his
major three-volume biography of Pitt the Younger, Plumb made scant impact
in his later decades. Possibly this sense of unrealized potential contributed
to his frequently acerbic character.
Despite
his successful attempt to write in a fluent and accessible fashion, Plumb
also lacked a popular touch. While Regius Professor, Sir Geoffrey Elton,
Plumb's great Cambridge opponent, took part in a student balloon debate
in which participants impersonated famous individuals and explained why
they should not be thrown from a rapidly falling balloon in order to save
their fellow travelers. Elton gave a solid performance as Henry VIII's
minister Thomas Cromwell ("his" Walpole). One of the students pretended
to be Elton. It is difficult to imagine Plumb, or any other senior professor,
taking part in such an occasion; or any other being thought noteworthy
enough to be impersonated by a student.
In
many respects, the academic world is driven chiefly by its internal politics,
but there is also a wider resonance that is of general importance. Academic
debate contributes to public culture and political controversy. It is also
greatly affected by a politics of personality, patronage, and, yes, intellectual
faction and engagement, that helps determine publication possibilities,
careers, and the perception of quality.
Plumb's
career shows this process at work, but it is one that is curiously difficult
to analyze. Historians anatomize others, but are not good at seeing the
role of personality and patronage in their own world. Talking with professional
colleagues, I was struck by their willingness to recount histories that
reflected no credit on Plumb, but also their sense that these stories should
not be mentioned, or at least not attributed. I was told it would be personally
damaging to write such a piece. Among historians I admire who have read
this piece one remarked: "It needs saying, but you are brave." Another
wrote: "This is pretty strong stuff! I'm torn between thinking it ought
to be a matter of record and concern that people will think you mean or
badly motivated." On the Guardian website on November 3, 2001, Ian
Mayes discussed the critical responses to an item in the paper recounting
an instance of Plumb's notorious temper. Apparently "the editors of the
obituaries page have now been asked by the editor of the paper to consider
pejorative elements in obituaries and letters more carefully."
From
its outset, The Historical Society has declared that it "will be a place
in which significant historical subjects are discussed and debated sharply
in an atmosphere of civility, mutual respect, and common courtesy." No
doubt some readers may think I have violated this desideratum. But the
question I am raising is this: how are we as historians supposed to understand
people and their impact if we wait decades and shy away from any unpleasantness?
At the individual level, for Plumb there is no grieving partner nor any
bereft children. If obituaries are largely eulogistic and only those in
some sort of magic circle know otherwise, then haven't we as historians
failed? Or should historical reputation simply rest on assessments of scholarship
alone?
Jeremy
Black is professor of history at the University of Exeter and the author
of Walpole in Power (Sutton, 2001).
All
the President’s Words Hushed[1]
by
Robert Dallek
Ever
since the presidency became the focus of U.S. political life during Theodore
Roosevelt’s years in the White House, journalists and historians have discussed
the importance of presidential decision-making. Why do presidents give
priority to one domestic issue over another? Why and how do they decide
between war and peace?
Journalists
initially answer these questions with the limited knowledge available to
them, always mindful that “White House sources” provide them with the information
that will advance a president’s agenda and serve his political standing.
Historians with the luxury of hindsight and, more important, access to
a much fuller record usually give us a better understanding of presidential
reasoning. Their studies are not simply exercises in academic analysis.
They often educate presidents, who are always eager to learn what accounts
for past White House successes and failures.
President
Bush, however, has severely crippled our ability to study the inner workings
of a presidency. On November 1, 2001, he issued an executive order that
all but blocks access to the Reagan White House and potentially that of
all other recent presidents. Practically speaking, Bush’s order hinders
the opening of 68,000 pages of confidential Reagan communications with
his advisors. Under the 1978 Presidential Records Act, a systematic release
of presidential papers in response to Freedom of Information requests can
only occur twelve years after a president leaves office. The law’s intent
was to assure the timely release of presidential materials that would serve
the government’s and the public’s understanding of the country’s history,
especially decision-making in the White House. The Bush administration,
including a statement by the president himself, contends that the executive
order is needed to guard against revelations destructive to national security.
But this assertion will persuade no one who has even the slightest knowledge
of presidential papers. Just a few days in the Kennedy or Johnson libraries
would be enough to convince anyone that ample safeguards against breaches
of national security and violations of personal privacy already exist,
and these are for papers dating from the 1960s, not the 1980s. Moreover,
access to previously closed documents makes clear that presidents and government
agencies always err on the side of excess caution.
If
national security is not the motivating force behind Bush's executive order,
what is? We can only speculate that he is trying to protect members of
his administration, who also served under Ronald Reagan, from embarrassing
revelations. It is also possible that he is endeavoring to hide his father's
role in the Iran-Contra scandal. And it is imaginable that he is already
thinking about shielding the inner workings of his own administration,
particularly his excessive dependence on senior advisors in deciding both
domestic and national security issues about which many outsiders believe
he has been poorly informed.
Researchers
trying to reconstruct the country's past are not the only losers when access
to historical records is reduced. Current policymakers dependent on useful
analogies in deciding what best serves the national interest are also harmed.
The more presidents have known about past White House performance, the
better they have been at making wise policy judgments. President Franklin
D. Roosevelt's intimate knowledge of President Woodrow Wilson's missteps
at the end of World War I was of considerable help to him in leading the
country into and through World War II. Lyndon B. Johnson's effectiveness
in passing so much Great Society legislation in 1965 and 1966 partly rested
on direct observation of how Roosevelt had managed relations with the Congress.
President Harry S. Truman's error in crossing into North Korea was one
element in persuading George Bush not to invade Iraq.
The
recent release of additional Johnson tapes underscores how much historical
understanding can influence presidential decision making. Tapes of LBJ
talking about Operation Rolling Thunder, the systematic bombing of North
Vietnam begun in February 1965, reveal a president with substantial doubts
about the wisdom of the air campaign. "Now we're off to bombing these people,"
Johnson said to Defense Secretary Robert S. McNamara. "We're over that
hurdle. I don't think anything is going to be as bad as losing, and I don't
see any way of winning."
"Bomb,
bomb, bomb. That's all you know," Johnson said to Army Chief of Staff Harold
K. Johnson. " . . . . I don't need ten generals to come in here and tell
me to bomb. I want some solutions. I want some answers," the president
declared. "Airplanes ain't worth a damn, Dick . . . . [,]" he complained
to Senate Armed Services Chairman Richard Russell. "I guess they can do
it in an industrial city. I guess they can do it in New York . . . . But
that's the damnedest thing I ever saw. The biggest fraud. Don't you get
your hopes up that the Air Force is going to" win this war. "Light at the
end of the tunnel?" LBJ exclaimed to Bill Moyers about the bombing. "Hell,
we don't even have a tunnel; we don't even know where the tunnel is."
Johnson
knew about post-World War II surveys of wartime bombing effectiveness.
They demonstrated that the aerial campaigns against Britain and Germany
not only didn't defeat them; they, in fact, stiffened resistance and encouraged
greater civilian war efforts. Johnson's well-justified doubts about bombing
made him all the more receptive to sending in ground forces.
It's
too bad that he didn't have access to a memo President John F. Kennedy
had sent to McNamara in November 1962, a week after the Cuban Missile crisis
ended. An invasion plan for Cuba, which might still be needed if the Soviets
did not follow through on a promise to withdraw "offensive" weapons from
the island, impressed Kennedy as "thin." He worried that "we could end
up bogged down. I think we should keep constantly in mind the British in
the Boer War, the Russians in the last war with the Finnish, and our own
experience with the North Koreans." If historical experience dictated against
an invasion of Cuba, how would he have felt about sending hundreds of thousands
of troops into the jungles of Vietnam?
Every
president uses history in deciding current actions. President Bush is no
different. Memories of his father's defeat over a failure to keep his promise
about no new taxes and a seeming indifference to the plight of the unemployed
have partly shaped his behavior as president. Bush might profit from a
history of Reagan's dealings with former Soviet President Mikhail S. Gorbachev
by an independent scholar, which for the time being his executive order
forecloses.
Indeed,
the principal victims of Bush's directive will be himself and the country.
The order will inhibit independent study of the Reagan and first Bush presidencies
and will impoverish the White House's ability to make difficult decisions
in both domestic and foreign affairs during the next three years. The more
the country knows about presidential decision-making, the better it can
decide whom to send to the White House. The study and publication of our
presidential history is no luxury or form of public entertainment. It is
a vital element in assuring the best governance of our democracy. Congress
should reverse Bush's order as a destructive act that returns us to an
imperial presidency and robs us of our history.
Robert
Dallek is a professor of history at Boston University. He is completing
a biography of John F. Kennedy.
LETTERS
To
the Editors,
Because
Jeremy Black’s essay, “The Present Emergency” (February 2002) is, I think,
serious and not a parody, I would like to reply to a number of his more
astounding claims. Black writes, “There is no inherent reason why Islamic
society should be anti-American”—which apparently suggests therefore that
America might have done something wrong to subvert this natural affinity.
He then goes on to assert that the United States has more in common with
the Islamic world than with China, Russia, and Europe. Yet other than Turkey,
there is not a single true democracy in the Islamic world. Women cannot
drive or vote in Saudi Arabia. Murder is state-sponsored in Iraq. There
is no habeas corpus in Libya. Theocracy, monarchy, autocracy, or military
dictatorship, whether in Iran, Kuwait, Saudi Arabia, Iraq, Libya, Algeria,
or Pakistan, are the way of the region, with all the tragedy that characterizes
such absolutism—lack of an independent judiciary, open press, religious
tolerance, civic audit of the military, and free speech.
In
contrast, nearly every European nation is, like America, democratic and
free, while Russia is undergoing a democratic revolution seldom witnessed
in recent history. Black seems to be unaware of this common adherence to
constitutional government, and so maintains that his proposed natural commonality
between America and Islam rests with a mutual “powerful affirmation of
religious values.” Forget about differences between Christianity and Islam
for the moment, and the suggestion that we may be fellow-fundamentalists,
and simply consider that one of the Muslim world’s chief complaints against
America seems to be precisely our own commitment to a sometimes godless
rationalism and religious diversity—and the dizzying pace of globalism
and modernism, which are the dividends of a free, capitalist, and secular
society, one largely immune from religious audit and state-sanctioned worship
so common in much of the contemporary Islamic world.
Black
adds that a successful strategy against the terrorists “requires a reexamination
of American policies in the Middle East that may well be impossible for
American statesmen and policymakers.” But why would such a reexamination
be “impossible” for a democratic society such as ours—one that reflects
the views of its voting constituents? If Black is referring to the Israel
question (“…because of pronounced support for Israel”), then he should
know that such American backing for Israel (consistently recorded at over
60% in polls) derives from its status as the only free and democratic society
in the region. Our policy is not the result of some mythical force that
would make our change “impossible”—but mirabile dictu based on old-fashioned
idealism rather than real strategic national interest.
More
importantly, there is no reason to believe that the events of September
11 had anything much to do with American policy in the Middle East, as
alleged by Black and, of course, on occasion, by bin Laden himself—along
with the terrorists’ other grievances such as “Jewish women walking in
the Holy Land” and general American decadence. Al Qaeda’s earlier attack
on the USS Cole coincided exactly with Mr. Clinton’s deep involvement
in Middle-East negotiations that resulted in an American-sponsored offer
to return 96% of the West Bank. True, in a recent poll Kuwaitis replied
that a reason for their overwhelming dislike of Americans (72% of those
surveyed) was America’s policies in Palestine. But then this is a populace
that a mere decade ago was saved from extinction by the Americans, and
that subsequently ethnically cleansed its own country by deporting Palestinians
on sight.
Pace
Black, we need not necessarily believe the purported grievances of either
the terrorists in particular or the Islamic world in general against the
United States—who saved Islam from Russian atheism in Afghanistan, Somalis
from starvation, and Kosovars from murder—any more so than those hurts
voiced by Germany and Japan on the eve of World War II. As Thucydides reminds
us, states and parties can just as readily go to war over perceived slights
and irrational concerns such as “fear, self-interest, and honor” as they
can over legitimate grievances.
Black
also has a disturbing habit of misrepresenting, or perhaps simply failing
to grasp, what others have written. Of John Keegan’s dozens of persuasive
columns published since September 11, we are told only that he writes “for
the most conservative of the newspapers” (was such a political characterization
used by Black in his reference to Felipe Fernandez-Armesto?). He then summarizes
Keegan’s views as “the dispatch of cruise missiles against those who send
encrypted messages through the Internet”—a single, out-of-context quote
that might imply that Mr. Keegan was advocating a Tomahawk against your
local credit card company for encoding your Visa number.
Black
also suggests that my past article displayed a careless use of historical
exempla. I was asked by Historically Speaking on Thursday,
September 13, to contribute a brief essay on the events that had transpired
two days earlier. I wrote and submitted the essay shortly thereafter, before
the response that commenced on October 7. At a time of general pessimism
concerning the proposed American action in Afghanistan, I wrote that the
general lessons from the West in past wars suggested to me that in the
weeks to come America would, in fact, prove quite effective against the
terrorists. Nothing that has transpired since then has convinced me that
I was wrong in that initial prognosis. Months later, pace Black,
I still see no reason why Islamic fundamentalist terror cannot go the way
of Nazism and Japanese militarism—if we display, as did our predecessors,
a similar determination to fight tirelessly against such fascism and the
evil it represents.
Black
goes on to make the astonishing claim that “civic militarism is not a trademark
of Western militaries.” In fact, civic militarism originated exclusively
in Greece and Rome, and its traditions remain strong in the West today.
That a modern America of 300 million does not have conscription—with a
youthful cohort perhaps of, say, 30 million under arms—means little. American
soldiers in fact freely enlist with clear contractual rights and responsibilities,
with legal counsel, rights of appeal of court action, and general conditions
of service subject to civilian audit and legislative oversight. Soldiers
in Western Europe and America enjoy a privileged legal status that reflects
their standing as free citizens. In that regard, they are a world away
from unfree draftees in Iraq, China, and North Korea, whose societies have
no connection with the unique Western legacy of civic militarism.
Black
suggests the “failure of European imperial powers in the 20th century”
disproves my general assertion of long-term Western military superiority;
but then he strangely refutes his own point by immediately citing those
who “suffered from Western imperialism”—an imperialism which apparently
was successfully imposed largely by arms? He adds that those who suffered
from colonialism “may also be surprised to know that constitutional government
is a foundation of Western culture.” But imperialism and Western constitutional
government—whether in ancient Athens, republican Rome, 16th-century Venice,
or 19th-century Britain—are not necessarily antithetical. My point was
hardly that constitutional government was always as utopian as, say, the
present-day EU—only that it encouraged military dynamism in a way that
other types of rule often could not. The British had no moral right to
be in Zululand, but their parliamentary system of government—in addition
to a variety of other institutions and cultural values—ensured that their
soldiers could be in Zululand in a manner impossible for Cetshwayo’s impis
to reach London.
Black
is equally mistaken in matters of detail and emphasis. The Ottomans survived
the aftermath of Lepanto not because of Western impotence, but rather because
of immediate squabbling among the powerful and victorious Europeans—and
the neutrality of England, Portugal, France, and the Dutch. The latter
were all busy with the exploration of the Americas, Africa, and the Orient—if
occasionally, along with the Venetians, siding with the Ottomans against
the Mediterranean Europeans. Again, Black confuses the issues of morality
and dynamism: Lepanto is not a case of Western idealism or even morality,
but rather another example of how capitalism, constitutional government,
notions of Western discipline, and traditions of individualism allowed
just a few European states to defeat an armada of an empire of millions.
At the battle, just three European states destroyed most of the Ottoman
navy—whose ships and guns were mostly patterned after Venetian designs,
and whose crews were often commanded by renegade Europeans. Yes, the Ottoman
navy was rebuilt after Lepanto—but largely by copying the Arsenal of Venice
and importing European designers and gunsmiths.
Thousands
of Totonacs and later Tlaxcalans joined Cortés in his attacks on
Tenochtitlan, as Black notes. But what was so significant about that? In
previous wars dozens of tribes had been fighting the Aztecs for decades
without much success. But (miraculously?) between 1519 and 1521 their efforts
abruptly led to the absolute destruction of their oppressors—that is, precisely
when a few hundred Spaniards landed, armed themselves with harquebuses,
crossbows, horses, cannon, plate armor, and brigantines, and then slaughtered
thousands of their enemies without themselves perishing.
Also
disturbing is Black’s seeming embrace of cultural relativism in raising
the old specter of terrorism being merely a relative term. Unlike Black,
most Americans can distinguish well enough between terrorists who seek
to destroy legitimate elected governments and those freedom fighters who
wish to liberate the unfree from autocracy—such as Black’s unintentional
example of the French patriots who fought for freedom against the Nazi-supported
Vichy government. And yes, there would be, after all, a moral difference
between seeking to overthrow the Stalinist Soviet Union that was responsible
for the murders of innocent millions and battling the present-day Russian
government that has both a freely-elected parliament and president.
Perhaps
most regrettable is Black’s continuous patronizing of Americans (“it is
difficult for Americans to appreciate the fundamental differences between
American views on, for instance, the Middle East, and those of their allies”).
Pace Black, it is not difficult at all. Americans are only too aware
of such differences. And so we are increasingly disturbed at the European
response—whether characterized by the failure of NATO allies for weeks
to invoke Article V after America suffered its greatest attack on its homeland
in history, or Europe’s continual trade with Iraq, or the EU’s failure
to spend sufficient funds to ensure its own safety, or the collective amnesia
about everything from the liberation of Paris to the Berlin Airlift.
Quite
simply, the problem in the present crisis is not a lack of information
over here that requires tutorials from our more sophisticated European
cousins (“… the advantages of politicians with Oxbridge degrees”). Nor
do we need reminders after attacks in Lebanon, Saudi Arabia, the Sudan,
Somalia, Yemen, and the first World Trade Center bombing that “Americans
are being introduced to the real world.” Rather, September 11 has shown
that the disturbing divide arises from more fundamental questions of morality
between America and an increasingly bureaucratic and not-so-democratic
EU. Far from wanting Europe to “march in step,” we are instead discovering
that Kosovars and Bosnians will be slaughtered in the midst of Europe if
America does not intervene, that the fact of Israeli democracy, free speech,
and religious tolerance means little in the realpolitik of European
thinking when it is a question of oil and trade, and that our allies are
more likely to voice concern over turbans and Froot Loops
in Cuba than to express outrage when American captives in Afghanistan and
Pakistan are brutally executed by their captors. Black is correct to remind
us of the precarious nature of past alliances; but most Americans seem
to be discovering more affinity with democratic, multiracial, and nuclear
India and Russia than they have so far with their erstwhile friends in
NATO.
Recent
disclosures about the misuse of historical evidence and the violation of
academic protocols on the part of prominent American historians remind
us all that as stewards of the past we have a duty to remain honest and
disinterested scholars. But just as importantly, the events of September
11 also require from us intellectual honesty, analyses, and interpretations
that can sometimes prove unpopular to our colleagues in the profession,
and yes, on occasion the employment of moral judgment—rather than misrepresentation
of the views of others, retreats into fashionable relativism, tired warnings
about American naiveté, and academic equivocation that seeks to
appear sober and judicious without being either.
Victor
Davis Hanson
California
State University, Fresno
Jeremy
Black replies:
I
am not sure if it will be labeled as misrepresentation, relativism, or
equivocation, but let’s start with the consequences of Lepanto. It was
not followed by the recapture of Cyprus, let alone, as Don John hoped,
by the liberation of the Holy Land. An attempt to retake Modon failed in
1572. The following year, Venice agreed to peace terms with the Ottomans,
recognizing the loss of Cyprus. Unable to respond to the Ottoman capture
of Tunis in 1574, Spain followed with a truce in 1578. Lepanto was arguably
more decisive for what it prevented—a possible resumption of the Ottoman
advance—than for pushing the balance of military advantage toward the Christians,
although the move towards war with Persia may be regarded as even more
important.
Hanson
denigrates those with different opinions, but his self-proclaimed stewardship
is too extreme for my tastes. Let me assure him there is nothing “national”
in this: there are many examples of crude and simplistic rants from writers
on this side of the Atlantic. As I have covered many of these points under
debate in my War in the New Century (Continuum, 2001), I do not see much
point in doing more than underlining the need to consider carefully a whole
host of issues and factors Hanson cuts his way through with his fiery prose
and what I consider to be simplistic analysis. He appears to wish to place
me with the European Union and the left, but people rarely fit into boxes
that conveniently. I’m actually a paid-up member of the Conservative Party
and a Euro-sceptic.
I
indeed hope that the terrorists will be unsuccessful, and am encouraged
by the willingness of many American political and military leaders to appreciate
the complexities of the Islamic world. A minor point on which to conclude:
impatience with others is a natural product of crises to which there are
different responses. I am less clear that this is helpful for scholars.
I do not doubt American resolve in World War II because I can understand
why the American government and public chose not to “fight tirelessly against
such fascism” prior to December 1941. Perhaps if Hanson chose to probe
the complexities of other societies, he might find it instructive. But
I fear that this will simply be dismissed as relativism. However it is
described, such a process is important to scholars, and, indeed, to patriots
if they wish to offer effective advice.
Jeremy
Black
University
of Exeter