When the academy was the exclusive playground of white men, it produced the theories of race, gender, and Western cultural superiority that underwrote imperialism abroad and inequality at home. In recent decades, women and people of color have been critical to producing new knowledge breaking down those long-dominant narratives. Sociological research confirms that greater diversity improves scholarship.
Yet the struggle to diversify the academy remains an uphill battle; institutional biases are deeply ingrained, and change evokes nostalgia for times past. Both of these obstacles were fully in evidence at a recent Applied History conference at the Hoover Institution at Stanford University. Although history is a discipline with a growing number of nonwhite faculty members and a healthy percentage of female scholars—indeed, women constitute more than a third of the faculty in Stanford’s own history department, across the bike lane from the Hoover Institution—the Hoover conference was made up of 30 white men (and one woman, who chaired a panel). These white men gathered to discuss the supposed fact that the “majority of academic historians have tended to shy away from questions of contemporary interest, especially to policy makers. Previous generations were less shy of such questions,” the conference website claimed.
Has the current generation of historians in fact abdicated its responsibility to consider questions of contemporary interest? Most historians would find this claim silly; history is always about questions of contemporary interest, always “applied.” So how has the new, more diverse generation of historians produced work with policy implications?
The Harvard historian Caroline Elkins won a Pulitzer Prize in 2006 for fully exposing the violence of British decolonization in Kenya, puncturing longstanding myths about peaceful withdrawal. Her work has resulted in successful civil lawsuits against the British government by Kenyan survivors.
Catherine Hall of University College London chairs a group of historians assembling a database of British slave owners. In showing how slave ownership has skewed racial and class relations in Britain for centuries, their work opens up a range of international and domestic policy possibilities for righting historical wrongs.
Stanford’s first president, David Starr Jordan, was a key promoter of eugenicist theories of race. Now, Allyson Hobbs, an African-American historian at Stanford, has written an award-winning history of racial passing showing both the constructed nature of racial identity and the arbitrary nature of racial laws—with implications for policies about social identification and race today.
Another Stanford historian, Ana Raquel Minian, who grew up in Mexico, has utterly punctured myths about welfare-scrounging Mexican immigration to the United States in the 20th century—a burning political issue with pressing policy implications right now, which Professor Minian has discussed in various public venues.
Also at Stanford, the eminent historian of science and colonialism Londa Schiebinger has led international governmental efforts to address the fact that medical treatments and other technologies developed without attention to differing effects on men and women have historically posed enormous health risks—and market costs.
These are just a handful of examples from my immediate field and home department. All over the academy, historians are producing work relevant to policy and easily accessible to policy makers. Many work hard to share their work with the public.
The problem is not that historians don’t produce policy-relevant research but that their work tends to cast a critical light on the current political order, and policy makers therefore often willfully ignore it.
Many historians have shown that the Second Amendment was about the right to arms for military, not civilian, purposes, but policy makers like Sen. Marco Rubio of Florida ignore that research. Indeed, a group of distinguished historians including Lois Schwoerer and Jack Rakove filed an amicus brief on the Second Amendment in the landmark 2008 case District of Columbia v. Heller, but Justice Antonin Scalia proved impervious to it. The resulting decision enshrined a dangerously expansive—and historically inaccurate—understanding of the amendment.
Historians also warned us about the dangers of the Iraq War. In particular, the Middle East scholar Juan Cole, from the University of Michigan at Ann Arbor, acquired an enormous following through his blog by laying out the case against the war. But such discouraging views were not heeded by an administration so bent on war that it not only ignored history but faked it—cooking up a fable of weapons of mass destruction.
Given the proclivities of policy makers, the historian’s real role is, in fact, to speak to the public, so that people may exert pressure on their elected representatives.
This idea is itself born of the imperial past. When a British missionary in India named Edward Thompson joined British forces in an earlier invasion of Iraq—as an army chaplain during World War I—the experience disillusioned him profoundly. He sought to atone by correcting the British public’s understanding of the colonial enterprise. So, this white man wrote a history of the massive Indian Rebellion of 1857, which Britons had long portrayed as a diabolical attack on an entirely benevolent British presence. His account acknowledged the real political protest the rebels expressed and the British violence that provoked theirs. It was 1925, and Thompson’s book became part of public debate about the increasingly powerful Indian nationalist movement.
Thompson developed a passionate faith in the historian’s craft as the most effective means of truth-telling against the state. His son, the historian and political activist E.P. Thompson, grew up “expecting governments to be mendacious and imperialist and expecting that one’s stance ought to be hostile to government.” He looked to the historic libertarian tradition of working-class people for ways to check the excesses of the “secret state” in the Cold War era that shaped his life. He realized that modern democracy, simply by virtue of its insistent demand for openness, tends to foster an almost paranoid official secrecy and that the historian is the archetype of the active citizen. Thus emerged our 20th-century understanding of the historian as a critic of government.
Of course, there are many other sources of the idea of the historian-as-critic; I offer this “great white man” version ironically. E.P. himself pushed back against the “Great Man” version of history, encouraging a new trend of writing “history from below.” In 1988, Joan Wallach Scott, professor emerita at the Institute for Advanced Study, pointed out the gendered nature of his work, the way working-class men stood in for both men and women. Another field was born. And so on, inclusivity breeding inclusivity, by degrees, in fits and starts.
To be sure, historians have, in some ways, ceded our claim to policy expertise to other kinds of scholars: economists, political scientists, sociologists. This is partly the result of new dogmas equating expertise and quantitative analysis. It is also part of an intrinsically antihistorical, universalist approach to understanding political change, which imagines that what worked in Country A will work in Country B regardless of history and context—that, for instance, the forces that drove the industrial revolution in 18th-century Britain will do the same in a different place centuries later, or that, since the defeat of Hitler gave rise to a liberal democracy in West Germany, removal of dictators will always and necessarily do so. Recent history is littered with evidence of the folly of such logic.
Historical interpretation is crucial to contemporary issues like gun control, immigration, and the “war on terror.” Historians must continue to assert their expertise on such matters against the monopolistic claims of social scientists—and against those who would prefer a cloistered group of white men to remonopolize that role.