The Purloined Mainframe: Hiding the History of Computing in Plain Sight
John Laprise
Northwestern University
Editor: Nathan Ensmenger
I have heard many times from my colleagues that I am lucky to be involved in such an edgy, greenfield area as the history of computing. We are unconstrained by the shackles of decades of prior scholarship and find ourselves with an abundance of relevant, important topic areas for which there is high demand. We are fortunate that our focus lies on a moving technological target that constantly presents us with new material for research. Moreover, the history of computing is a foundational field for even newer, hotter fields of inquiry such as Internet and "new media" studies. I smile and nod.
Still, I have also heard it said that abundance and a lack of restraint come with their own challenges. This is nowhere more apparent than when we look at how the US federal government uses computers. I am not talking about the frequently cited narratives of Paul Edwards, Thomas Hughes, and James Beniger,[1] nor the influence of the Department of Defense on early innovation initiatives such as Arpanet.[2] I am referring to a narrative like Jon Agar's excellent work chronicling the use of computers by the British government.[3] Scholars in the US have ignored Agar's lead and avoided negotiating the labyrinthine federal records system. To date, scholarship on the US government's use of computers has been broad and general, as in the early economic work of Kenneth Flamm and the recent work of James Cortada.[4]
The US federal government is immense. Executive branch departments are the size of large multinational corporations, with similar reach and power. Yet historians of computing looking at government have largely focused on the noncryptographic elements of the Department of Defense and on NASA. Both of these organizations have a well-documented interest in computer technology. What about everyone else?
Computers in the White House
One would suppose that the White House might have been somewhat influential in the evolution of computers, since the federal bureaucracy was IBM's largest client prior to 1970. Imagine my surprise four years ago when I discovered a proverbial "gap in the literature." A discovery like this is every doctoral student's dream and, as I came to realize, something of a nightmare. I started out small, just trying to trace the history of one innocuous agency within the Executive Office of the President (EOP), the Office of Telecommunications Policy (OTP), which had the unusual early mandate to make federal computer policy. This organization's administrative history led me into the administrative records of the Department of Commerce (DoC). In the meantime, I stumbled into the records for something called "information automation" within the records of the Secretariat of the National Security Council (NSC).
To my great surprise, there it was: the documentary record of the very first computer system adopted by the White House, including technical specifications, software manuals, and memos from White House officials, the RAND Corporation, the DoD's Defense Communications Agency (DCA), and the White House Communications Agency (WHCA), most of them lying in a series of dusty folders in a box of NSC administrative records that had never been classified. The very helpful archivist at the National Archives remarked to me that, according to his records, some of these boxes had never been checked out. Needless to say, I put the OTP on the back burner and refocused.
In the following year, I expanded my focus to uncover the story of White House computing during the 1970s. I was constantly surprised by the availability of records on what I considered sensitive topics, such as telecommunications security during the Ford administration and the technological debacle that was the Carter White House. I learned to make sense of the alphabet soup of agencies that make up the federal government. I was even fortunate enough to collect an oral history from the White House's first computer project manager. I then found myself in an academic sinkhole.
My colleagues were very excited by my research, but the persistent critique of my work was the absence of a sufficient literature review. Many scholars in established fields have the reverse problem: an abundance of previous scholarship and a dearth of new material. I found myself in the opposite position. After much wailing and gnashing of teeth, I came to a solution: I invoked multiple, sometimes unconventional literatures to show how each is in turn applicable to, and enriched by, my research.
The problem with greenfields...
As historians of computing, we have a twofold problem. First is the aforementioned lack of restraint.
All of us recognize the debt we owe to previous scholars, and a solid literature review illuminates those links. All too frequently, we find that our research exceeds the scope of earlier work, especially in our own field. Our field of inquiry encompasses an ever-expanding body of source material that sometimes fits imperfectly into existing scholarship. One solution is to "cut to fit," wherein we constrain the narrative to fit the literature. Although this works well when preparing articles for journals based in particular fields, it impoverishes the histories that we uncover.
The alternative is to engage a broader range of literature in an effort to better explain the nuances of our historical narratives. Complicating this solution is an ever-present, though weakening, bias against interdisciplinary research. Having come from a thoroughly interdisciplinary background and having joined a program that further encouraged it, I did not find this bias pronounced. So when I introduced literature on presidential administrative studies into my work, it was well received. The same cannot be said for many of my colleagues, who face demands for disciplinarity. This, then, is a call for interdisciplinary leniency from a junior scholar who has experience with its value and importance.
The second problem is abundance. There are many worthy narratives that can illuminate our understanding of the history of computers, and we access them with varying levels of difficulty. As historians with an abundance of source material, we should not mistake the easy for the worthy. Federal computing history is a case in point. There are many unwritten histories whose content can be revealed only through hard historical research and that might lead us to unconventional answers. Federal records are not always easy to decipher, but the lack of research on how various elements of the government came to adopt and use computers is astounding given that the documentary record is readily available.
After one of my presentations, a noted colleague questioned my work, wondering why the White House computer system's utilization numbers tended to be under 50% when processing time in the 1970s was a scarce and valuable resource. I had failed to bring out in my presentation that, as a national security resource linking the US president to the armed forces and their nuclear weapons, the computer system required redundancy. Therefore, the system's "partial" utilization, which would have been unthinkable for a company running an expensive mainframe, was simply routine best practice in a national security environment.
Fading resources
This brings me to my final point. We should not take our abundance for granted. Part of the richness of our history is our ability to conduct oral histories with the people we read about in documents. I was fortunate to conduct one but lost a second opportunity: I was in the process of setting up interviews with Clay Whitehead, the first head of the OTP, when he lost his fight with cancer. Human mortality diminishes our abundance on a daily basis, and the barriers preventing us from going out and interviewing the people we write about are feeble.
We are fortunate to be in a branch of history that focuses on a period beginning with World War II. Our subjects are still changing and evolving, offering us rare academic opportunities for research. Rather than take these opportunities for granted, we should share our findings beyond our historical boundaries and take full advantage of the richness of our chosen field.
Acknowledgments
I acknowledge historians of cryptography and their unique struggles to bring to light the intelligence community's use of computers, including David Kahn, James Bamford, and William Burroughs. They are all too infrequently cited.
References
1. J. Beniger, The Control Revolution, Harvard Univ. Press, 1986; P.N. Edwards, The Closed World, MIT Press, 1996; T.P. Hughes, Rescuing Prometheus, Vintage, 2000.
2. J. Abbate, Inventing the Internet, MIT Press, 2000.
3. J. Agar, The Government Machine, MIT Press, 2003.
4. K. Flamm, Creating the Computer, Brookings Institution, 1988; J. Cortada, The Digital Hand, vol. 3, Oxford Univ. Press, 2008.
Readers may contact John Laprise about this article at j-laprise@northwestern.edu.
Contact department editor Nathan Ensmenger at annals-thinkpiece@computer.org.