Interfaces

Essays and Reviews in Computing and Culture 

Interfaces publishes short essay articles and essay reviews connecting the history of computing/IT studies with contemporary social, cultural, political, economic, or environmental issues. It seeks to be an interface between disciplines, and between academics and broader audiences. 

2021, Volume 2

Editors: Jeffrey R. Yost and Amanda Wick

 

Before the Byte, There Was the Word: The Computer Word and Its Many Histories

Johannah Rodgers

Abstract: Tracing and documenting the genealogies of what, in the twentieth century, will become known as “the computer word,” this article explores the importance of the term to the histories and presents of digital computings, the technical and rhetorical functions of verbal language involved with its emergence in the mid-twentieth century U.S., and the import of the term’s currency in discourse networks forged across industry, government-sponsored university research initiatives, and popular media.

PDF version available for download.

Illustration 1: Definition of "word" from the IEEE Standard Glossary of Computer Hardware Terminology.

What We Know

Unlike the terms bits and bytes, the computer word, which is defined by the IEEE as "a unit of storage, typically a set of bits, that is suitable for processing by a given computer," (Illustration 1) has not yet become part of popular discourse. Instead the term remains a technical one, familiar to every computer scientist and technician but not to the average consumer. Also unlike the terms bits and bytes, the origins of which have become part of the print record (bit is said to date from a January 9, 1947 Bell Labs memo drafted by John W. Tukey and byte from a June 11, 1956 IBM memo drafted by Werner Buchholz) (Tropp), those surrounding the computer word have not been well documented in either the histories of computings or related fields, including writing and media studies. Delving into the histories of computings archive, it is possible to identify a narrow time frame in which the term begins its emergence, sometime between late spring 1945, when John von Neumann drafts his notes that will later be referred to as the "First Draft of the EDVAC Report," and September 1945, when J. Presper Eckert, John Mauchly, et al., compile their report entitled "Automatic High Speed Computing: A Progress Report on the EDVAC." Yet, the story of the "computer word" is, like many in the histories of computings, neither a classic origin story nor one with a sole author/inventor or single conclusion. Rather, it is collaboratively authored, recursive in its structure, and has implications that, I believe, are only beginning to be fully explored. 

Every electronic computational machine since the ENIAC, the first fully electronic computing project in the U.S., has been described as having a "word size" and as containing a certain number of “words.” Acting as an interface between hardware and what will later become known as software, the computer word becomes one of the building blocks for machine and programming languages. It is one part of the process that enables hardware and the instructions controlling it to communicate and "understand" one another. Technically, choosing a computer's "word size" is one of the earliest steps in chip design and, metaphorically, the computer word can be said to function as a word does in "telementational models" (Harris) of human-to-human communication: it allows for information to be transmitted and exchanged according to a "standard" meaning. While, in human communication, verbal words, i.e., those spoken or inscribed by humans, rarely maintain a fixed meaning, in machine communication, the computer word and the bytes and bits that will later be said to compose it have been, over time, made to adhere to such standards. Figuratively, if bits can be said to be the millimeters of digital electronic computers and bytes the centimeters, the computer word can be said to function as the meter.
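To make the arithmetic behind this analogy concrete, the short Python sketch below (an illustration added for this discussion, not part of the historical record) shows how a fixed word size, counted in bits, determines how many 8-bit bytes a word contains and what range of values a single word can hold. The function name word_capacity and the example sizes (8, 16, 32, 64 bits) are assumptions chosen for illustration, not claims about any particular historical machine.

    # Illustrative sketch: how a fixed word size (in bits) constrains what a
    # single machine word can represent. The 8-bit byte and the example word
    # sizes below are assumptions for illustration only.

    def word_capacity(word_size_bits: int) -> dict:
        """Return basic facts about a word of the given size."""
        return {
            "bits": word_size_bits,
            "bytes": word_size_bits // 8,                     # assuming the 8-bit byte
            "unsigned_max": 2**word_size_bits - 1,            # largest unsigned integer
            "signed_range": (-(2**(word_size_bits - 1)),      # two's-complement minimum
                             2**(word_size_bits - 1) - 1),    # two's-complement maximum
        }

    for size in (8, 16, 32, 64):
        print(size, word_capacity(size))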

Illustration 2: Two women operating the ENIAC's main control panel while the machine was still located at the Moore School. "U.S. Army Photo" from the archives of the ARL Technical Library. Left: Betty Jennings (Mrs. Bartik) Right: Frances Bilas (Mrs. Spence).

Despite the technical significance of the computer word to the historical and current functions of digital electronic computers, documenting its histories is, for several reasons, anything but straightforward, in part because of the complexities of the EDVAC project itself, in part because of the later depictions of the EDVAC project in relation to projects predating it, and in part because of issues related to the histories of computings archives. Unlike the ENIAC, the EDVAC project unfolded during a time of transition from war-time to post-war funding priorities for the U.S. military and from university-focused to industry-focused research and development initiatives. As a result, it continues to provide scholars with a wealth of material and issues (technical, economic, political, and socio-cultural) to consider in relation to other electronic and electro-mechanical computing projects in the United States. The EDVAC project was, as Michael Williams has clearly documented, a fraught one and produced a machine that may actually have been operational for only a very short time and that differed considerably from initial design documents. Further, recent research related to the ENIAC project by Haigh, Priestley, and others has emphasized the similarities rather than the differences between the ENIAC and EDVAC projects and called into question the portrayal of the "Von Neumann" architecture as the invention of von Neumann or a clear departure from the architectures of earlier "computing" projects in the U.S.

The availability, accessibility, and reliability of documentary archival materials also all play roles in how the (hi)stories of the computer word can be told. To point to just two examples related to my research for this project, the digital copy of Eckert and Mauchly's "Automatic High Speed Computing" report available in the archive of the Computer History Museum is an excerpt of the complete report. While this particular copy is valuable to researchers since it is from the archive of Donald Knuth and contains his notes, at present no complete digital copy of the report exists that is publicly accessible. It was, in fact, only through the very generous support and assistance of the University of Pennsylvania Libraries Special Collections that I was able to remotely access a digitized copy. The existence of as yet uncatalogued materials raises other issues unique to the histories of computings archives. As a result of his ongoing research involving the Goldstine papers at the American Philosophical Society archive, Mark Priestley has drawn attention to manuscripts and unpublished lecture notes with significant implications not only for how specific terms, including the computer word, are interpreted, but also for other topics, such as how Turing's work may have been used by von Neumann.

Illustration 3: EDVAC Report Footnote.

While the paper trail documenting this term "computer word" will be for some time still unfolding, what we do have currently are paper traces documenting its evolution from a term with several different functions as a rhetorical device to a technical term and finally to a technical standard. The September 1945 "Progress Report on the EDVAC" appears to be the first time that the term “word” is used in an official document and proposal. Attributed to von Neumann in a footnote (Illustration 3), the term "word" (without quotation marks!) is introduced in a manner with more than slight Biblical overtones: "each pulse pattern representing a number or an order will be called a word*" (Eckert, et al.). In the earlier June 1945 draft EDVAC report, von Neumann refers to the unit that will later be referred to as a “word” as a “code word," a term that references the operations of telegraphic machines (Priestley) and also likely the “codes” contained on the punched cards used to feed program instructions to early automatic calculators, including the ENIAC and the Mark I. Although there are references to a/the "word," "words," and specific types of "words," e.g., logical words, machine words, instruction words, control words, in documents throughout the 1950s, the earliest mention of the term "computer word" in all likelihood appears later, around 1960 (Stibitz, COBOL Report).

What We Are Still Learning

These findings reveal some useful insights into both what we know and what we are still in the process of learning about the histories of computings and the roles of verbal language, linguistics, and language education in them.

As Nofre, et al., emphasize in their 2014 article "When Technology Became Language," the mid-1940s represent an important inflection point in how electronic digital computers are being conceived and discussed as "understanding" and engaging with language. The introduction of the term "word" to describe the operations of the EDVAC architecture appears to be one part of the discursive and technical transformation of high-speed automatic calculators into general-purpose digital electronic computers and into language processing (if not yet language possessing) machines. One of the key differences between the ENIAC and the EDVAC was the addition of new types of electronic storage media and their use not only for storing but for manipulating codes “internally” to instruct the machine (Burks). One part of the external memory in the ENIAC, as Eckert explains it in his first Moore lecture, was “the human brain” (116). The ENIAC was a decimal-based calculator and required significant input from skilled human operators in order to function. Both issues are noteworthy because part of the story of this metaphor of the word has to do with its emergence at the same time as a move away from human-readable toward machine-readable writing systems (decimal to binary) and communication and storage media (wires to pulses; cards, paper, and human brains to short and long tanks and mercury delay lines). In a Turing complete machine, the machine must have some way of representing its own operations; both the ENIAC and the EDVAC had this capability. However, one major difference between these two machines was the manner in which instructions were represented and communicated so the machine could “understand” them. With the ENIAC, wires were used as the system of notation; with the EDVAC, the alphabet became another system of notation (Alt).

These paper traces from the 1940s also underscore the collaborative environment in which military funded research projects were being developed and documented in the U.S. While I am not suggesting with this term "collaborative" that in such an environment everyone was acting communally or even getting along while they worked together, I am arguing that fluidity and responsive improvisation are evident both in the discourse and the technical systems being described. What this means for documenting the genealogy of the computer word is that if the neologism is to be attributed to von Neumann, it would be necessary to put quotation marks around the terms "computer word" and "von Neumann" to indicate that both are names applied retroactively to fix the meanings of phenomena that were still emergent when placed in their specific historical contexts. Neither the "computer word" nor what is now still frequently referred to as the "Von Neumann Architecture" emerges fully formed as a concept or technical standard in 1945 when von Neumann drafted his notes for the EDVAC project in Los Alamos, New Mexico, referred to the unit that will be used to communicate data and instructions in the EDVAC project as a "code word," and handed his notes to one or more secretaries to type up and possibly reproduce and circulate (Williams).

The absence of a single author or origin story for terms like the computer word reinforces the importance of analyzing the rhetorical contexts in which the word choices of von Neumann/"von Neumann" and others are made, as well as the processes of exchange and circulation of these terms (Nofre, Martin). Paul Ceruzzi's recent article in this journal about the myths of immateriality surrounding the natural resource intensive reality of "cloud"-based computing is one example of the power of names to shape discourse and its receptions (Ceruzzi). Another example relates to the complexities involved in attempts by Tropp to depict the emergence of the term "bit" as a story with a single author. As the multilayered documentation presented in Tropp's article makes clear, there were many contributors to the creation of the neologism bit, a term that became metaphorically and technically imbricated in human discourse and in the engineering projects that discourse informs and describes. The term "bit," which was originally named an "alternative" by Shannon, was deemed at one point a "bigit," and, while Tukey's memo may have been the first document we currently have access to in which the term appears, even a cursory reading of it reveals that drawing from it the conclusion that Tukey "coined" the term is far from certain (Tropp).

Illustration 4: EDVAC Report ENIAC-EDVAC Comparison

What's In a Word?: Teaching Machines to Read and Write

After 1946, the "signifying operations" (Rodgers) of terms related to and involving language as applied to digital electronic computers continue to widen in the technical, industrial, and popular literature (Nofre, Martin). Part of my interest in this relates to the fact that, in what we now call the early history of digital electronic computing, calculating machines are constructed based on models of the human, which are then explained via metaphors to influence decisions regarding the funding of educational and work initiatives for human computers and electronic computers based on the costs and interchangeability of the two (Grier). As Grace Hopper will point out in her 1952 article "The Education of a Computer," the EDVAC architecture is identical to a schematized cognitive model of a writing/calculating subject. In this context, the date and location of the drafting of the EDVAC Notes and later report are both significant considering their purpose and later reception, as are the later technical decisions that placed the affordances and performance standards of machinic operability over those of human legibility in the EDVAC project (Illustration 4). Yet, it is in part through verbal logic and the deployment of rhetoric that decisions were made regarding whose and what logics and languages would become "hard wired" into digital electronic computing machines.

Illustration 5:  Depiction of Human Brain as Instrument for Creation, Storage, and Exchange of Word Images (Starr, 1895)

Somewhat ironically, though perhaps inevitably, it is with issues of representation and the roles of spoken and inscribed language in depicting and constructing realities and histories that the intrigue involved with interpreting relationships between alphabetic words and computer words really begins. While it is impossible to know the exact reasons for a specific word choice, it is possible to consider the rhetorical contexts in which the EDVAC report was written. The term "word" functioned both to signal what was new about the project and to perform some explanatory work for an audience deciding the fate of the EDVAC funding proposals. Yet the target domain of this metaphor is human language processing, which, it is presumed rather than proved, the proposed technical system will replicate (Harris). In giving the EDVAC calculating machine the ability to "instruct" itself with the metaphor of the word, binary arithmetic calculation is paired with alphabetic communication in a way that has implications for the processes involved with both, that is based on assumptions about how language functions and what the purposes of communication are, and that works for the benefit of specific parties and interests (Dick). From a writing studies perspective, the word choices of "code word" and "word" connect the mid-1940s with the instrumentalization of writing and human writing subjects that had been occurring throughout the late nineteenth and early twentieth centuries (Gitelman, Rodgers) and to early discussions of "AI" and the roles and histories of writing, logic, and language education policies embedded in them (Kay, De Mol, Heyck). 

Acknowledgments

I am grateful to Joseph Tabbi, Cara Murray, and Robert Landon for their comments and suggestions related to earlier drafts of this article. Thank you also to Jeffrey Yost for his insightful suggestions, and to Amanda Wick and Melissa Dargay for their work and contributions. Thanks to Holly Mengel and David Azzolina for their assistance in remotely accessing a digital copy of "Automatic High Speed Computing: A Progress Report on the EDVAC," and to Donald Breckenridge for his practical, editorial, and emotional support. Finally, a special thanks to John H. Pollack, Curator, Kislak Center for Special Collections, Rare Books and Manuscripts at the University of Pennsylvania Libraries, and his colleagues Charles Cobine and Eric Dillalogue.

 


Bibliography

Alt, Franz. (July 1972). "Archaeology of Computers: Reminiscences, 1945-1947." Communications of the ACM, vol. 15, no. 7, pp. 693–694. https://doi.org/10.1145/361454.361528.

Burks, Arthur W. (1978). "From ENIAC to the Stored-Program Computer: Two Revolutions in Computers." Logic of Computers Group, Technical Report No. 210. https://deepblue.lib.umich.edu/handle/2027.42/3961.

Burks, Arthur W., Herman H. Goldstine, and John von Neumann. (28 June 1946). Preliminary Discussion of the Logical Design of an Electronic Computer Instrument. Institute for Advanced Study. https://library.ias.edu/files/Prelim_Disc_Logical_Design.pdf

Campbell-Kelly, M. and Williams, M. R., eds. (1985). The Moore School Lectures: Theory and Techniques for Design of Electronic Digital Computers, volume 9 of Charles Babbage Institute Reprint Series for the History of Computing. MIT P. https://archive.org/details/mooreschoollectu0000unse.

Ceruzzi, Paul E. (2021). "The Cloud, the Civil War, and the 'War on Coal.'" Interfaces: Essays and Reviews in Computing and Culture, Charles Babbage Institute, University of Minnesota. https://cse.umn.edu/cbi/interfaces.

De Mol, Liesbeth and Giuseppe Primiero. (2015). "When Logic Meets Engineering: Introduction to Logical Issues in the History and Philosophy of Computer Science." History and Philosophy of Logic, vol. 36, no. 3, pp. 195-204. https://doi.org/10.1080/01445340.2015.1084183.

Dick, Stephanie. (April–June 2013). "Machines Who Write." IEEE Annals of the History of Computing, Vol. 35, No. 2, pp. 88-87. https://doi.org/10.1109/MAHC.2013.21.

Eckert, J. P. (1985). "A Preview of a Digital Computing Machine." The Moore School Lectures: Theory and Techniques for Design of Electronic Digital Computers, volume 9 of Charles Babbage Institute Reprint Series for the History of Computing, edited by M. Campbell-Kelly and M.R. Williams, MIT P, pp. 109-128. https://archive.org/details/mooreschoollectu0000unse.

Eckert, J. P. and Mauchly, J. W. (September 30, 1945). Automatic High Speed Computing: A Progress Report on the EDVAC, Moore School of Electrical Engineering, University of Pennsylvania.

Gitelman, Lisa. (1999). Scripts, Grooves, and Writing Machines: Representing Technology in the Edison Era. Stanford UP.

Heyck, Hunter. (2014). “The Organizational Revolution and the Human Sciences.” Isis, vol. 105, no. 1, pp. 1–31. https://www.journals.uchicago.edu/doi/10.1086/675549.

Harris, Roy. (1987). The Language Machine. Duckworth.

Haigh, Thomas, Mark Priestley, and Crispin Rope. (January-March 2014). "Reconsidering the Stored-Program Concept." IEEE Annals of the History of Computing, vol. 36, no. 1, pp. 40-75. https://doi.org/10.1109/MAHC.2013.56.

Haigh, Thomas and Mark Priestley. (January 2020). "Von Neumann Thought Turing's Universal Machine was 'Simple and Neat': But That Didn't Tell Him How to Design a Computer." Communications of the ACM, vol. 63, no. 1, pp. 26-32. https://doi.org/10.1145/3372920.

Hopper, Grace M. “The Education of a Computer.” (1952). Proceedings of the 1952 ACM National Meeting (Pittsburgh), edited by ACM and C. V. L. Smith. ACM Press, pp. 243–49.

IEEE Standards Board. (1995). IEEE Standard Glossary of Computer Hardware Terminology. IEEE. doi: 10.1109/IEEESTD.1995.79522.

Kay, Lily. (2001). "From Logical Neurons to Poetic Embodiments of Mind: Warren S. McCulloch's Project in Neuroscience." Science in Context vol. 14, no. 4, pp. 591-614.

Martin, C. Dianne. (April 1993). "The Myth of the Awesome Thinking Machine." Communications of the ACM, vol. 36, no. 4, pp. 120-133.

Nofre, David, Mark Priestley, and Gerard Alberts. (2014). "When Technology Became Language: The Origins of the Linguistic Conception of Computer Programming, 1950–1960." Technology and Culture, vol. 55, no. 1, pp. 40-75.

Priestley, Mark. (2018). Routines of Substitution: John von Neumann’s Work on Software Development, 1945–1948. Springer.

Report to Conference on Data Systems and Languages Including Initial Specifications for a Common Business Oriented Language (COBOL) for Programming Electronic Digital Computers. (April 1960). Department of Defense.

Rodgers, Johannah. (July 2020). "Before the Byte, There Was the Word: Exploring the Provenance and Import of the 'Computer Word' for Humans, for Digital Computers, and for Their Relations." (Un)-Continuity: Electronic Literature Organization Conference, 16-18 July 2020, University of Central Florida, Orlando, FL, USA. https://stars.library.ucf.edu/elo2020/asynchronous/talks/11/.

Starr, M. Allen. (1895). "Focal Diseases of the Brain." A Textbook of Nervous Diseases by American Authors edited by Francis X. Dercum, Lea Brothers. https://archive.org/details/b21271161

Stibitz, George R. (1948). “The Organization of Large-Scale Computing Machinery.” Proceedings of a Symposium on Large-Scale Digital Calculating Machinery, Harvard UP, pp. 91–100.

Tropp, Henry S. (April 1984). "Origin of the Term Bit." Annals of the History of Computing, vol. 6, no. 2, pp. 154-55. https://dl.acm.org/doi/10.5555/357447.357455.

von Neumann, J. (June 30, 1945). First Draft of a Report on the EDVAC. Moore School of Electrical Engineering, University of Pennsylvania. https://history-computer.com/Library/edvac.pdf.

von Neumann, J. (1993). "First Draft of a Report on the EDVAC." IEEE Annals of the History of Computing, Vol. 15, No. 04, pp. 27-75. doi: 10.1109/85.238389.

Williams, Michael R. (1993). "The Origins, Uses, and Fate of the EDVAC." IEEE Annals of the History of Computing, Vol. 15, No. 1, pp. 22-38.

 

Johannah Rodgers (September 2021). “Before the Byte, There Was the Word: The Computer Word and Its Many Histories.” Interfaces: Essays and Reviews in Computing and Culture Vol. 2, Charles Babbage Institute, University of Minnesota, 76-86.


About the author:

Johannah Rodgers is a writer, artist, and educator whose work engages creatively and critically with the histories and presents of print and digital technologies to explore their connections and their roles in the sociologies and economics of literacies in the U.S.  She is the author of Technology: A Reader for Writers  (Oxford University Press, 2014), the Founding Director of the First Year Writing Program at the New York City College of Technology, where she was Associate Professor, and a participant in the 2020-21 University of Cambridge Mellon Sawyer Seminar on the Histories of AI.  You can read more about her projects and publications at www.johannahrodgers.net.

 

 

Early “Frictions” in the Transition towards Cashless Payments

Bernardo Bátiz-Lazo (Northumbria) and Tom R. Buckley (Sheffield)

Abstract: In this article we describe the trials and tribulations of early efforts to introduce cashless retail payments in the USA. We compare efforts by financial service firms and retailers. We then document the ephemeral life of one of these innovations, colloquially known as “Hinky Dinky.” We conclude with a brief reflection on the lessons these historical developments offer for the future of digital payments.

(PDF version available for download.)

A shopper leaving a Hinky Dinky Supermarket. ca. 1970s.

Photo credit: “Supermarkets as S&L Branches,” Banking Vol. 66 (April 1974) pg. 32

Let’s go back to the last quarter of the 20th century. This was a time when the high economic growth in the USA that followed the end of World War II was coming to an end, replaced by economic crisis and high inflation. It was a time when cash was king, and close to 23% of Americans worked in manufacturing. A time when the suburbs – to which Americans had increasingly flocked after 1945 escaping city centres – were starting to change. Opportunities for greater mobility were offered by automobiles, commercial airlines, buses, and the extant railway infrastructure.

This was the period that witnessed the dawn of the digital era in the United States, as information and communication technologies began to emerge and grow. The potential of digitalisation provided the context in which an evocative idea, the idea of a cashless society, first began to emerge. This idea was associated primarily with the elimination of paper forms of payment (chiefly personal checks) and the adoption of computer technology in banking during the mid-1950s (Bátiz-Lazo et al., 2014). Here it is worth noting that, although there is some disagreement as to the exact figure, the volume of paper checks cleared within the U.S. had at least doubled between 1939 and 1955, and the expectation was that this would continue to rise. This spectacular rise in check volume, with no corresponding increase in the value of deposits, placed a severe strain on the U.S. banking system and led to a number of industry-specific innovations emerging from the 1950s, such as the so-called ERMA and magnetic ink character recognition (Bátiz-Lazo and Wood, 2002).

The concept of the cashless, checkless society became popularised in the press on both sides of the Atlantic in the late 1960s and early 1970s. Very soon the idea grew to include the elimination of paper money as well. At the core of this imagined state was the digitalization of payments at the point of sale, a payment method that involved both competition and co-operation between retailers and banks (Maixé-Altés, 2020 and 2021).

Early Point of Sale terminals, ca. 1970s.

Photo credit: Hagley Museum and Archives, Philadelphia Savings Fund Society Collection 2062, Box 13, PSFS Online News Bulletin, Vol. 80-102, June 23, 1980.

In the banking and financial industry, new, transformative technologies thus began to be trialled and developed in order to make this a reality (Maixé-Altés, 2019). Financial institutions accepting retail deposits had been at the forefront of the adoption of commercial applications of computer technology (Bátiz-Lazo et al., 2011). Early forms of such technical devices mainly focused on improving “back office” operations and encompassed punch card electromechanical tabulators in the 1920s and 1930s; later came analogue devices (such as the NCR Post Tronic of 1962), and, in the late 1960s, the IBM 360 became widely adopted. But at the same time, regulation curtailed diversification of products and geography (limiting the services banks could provide their customers). These regulatory restrictions help to explain ongoing experiments with a number of devices which involved a significant degree of consumer interaction, including credit cards (Stearns, 2011), the use of pneumatic tubes and CCTV in drive-through lanes, home banking, and Automated Teller Machines (ATMs), which, despite being first introduced in the late 1960s and early 1970s, would ultimately not gain acceptance until the early 1980s (Bátiz-Lazo, 2018).

Like the banking and financial industry, the retail industry, with its very real interest in point of sale digitalization, was exposed to the rise of digital technology in the last quarter of the 20th century. The digitalisation of retailing occurred later than in other industries in the American economy (for a European account see Maixé-Altés and Castro Balguer, 2015). Once it arrived, however, the adoption of a range of digital technologies, including Point of Sale (POS) related innovations such as optical scanning and the universal product code (UPC), was extensive and transformed the industry (Cortada, 2004). From the perspective of historical investigation, the chronological place of such innovation, beginning in the mid-1970s, is associated with a remarkable period of rapid technological change in U.S. retailing (Basker, 2012; Bucklin 1980). Along with rapid technological change, shifts in the structure of retail markets, in particular the decline of single “mom and pop” stores and the ascent of retail chains, also became more pronounced in the 1970s (Jarmin, Klimek and Miranda, 2009). Two decades later, such large retail firms would account for more than 50% of the total investment in all information technology by U.S. retailers (Doms, Jarmin and Klimek, 2004). 

What connects the transformative technological changes that occurred in both the banking industry and the retail industry during this period is that both sought to utilise Electronic Funds Transfer Systems, or EFTS, as a way to reduce frictions for retail payments at the point of sale. During the 1970s and 1980s, the term EFTS was used in a number of ways. Somewhat confusingly, it was applied indiscriminately to specific devices or ensembles, value exchange networks, and what today we denominate as infrastructures and platforms. When referred to as a systems technology for payments, it was defined as one:

 “in which the processing and communications necessary to effect economic exchange and the processing and communications necessary for the production and distribution of services incidental to economic exchange are dependent wholly or in large part on the use of electronics” (National Commission on Electronic Funds Transfer, 1977, 1).  

Ultimately EFTS would come to be extended to the point of sale and embodied in terminals which allowed for the automatic, digital, seamless transfer of money from the buyer’s current account to the retailer’s, known as Electronic Funds Transfer at the Point of Sale, or EFTS-POS (Dictionary of Business and Management, 2016). 

One of the factors that initially held back the adoption of early EFTS and the equipment that utilised it was the lack of infrastructure that would connect the user, the retailer, and the bank (or wherever the user’s funds were stored). As Bátiz-Lazo et al. (2014) note, the idea of a cashless economy that would provide this infrastructure was highly appealing… but implementing its actual configuration was highly problematic. Indeed, in contrast to developments in Europe, some lawmakers in Congress considered the idea of banks sharing infrastructure as a competitive anathema (Sprague, 1977). Large retailers such as Sears had a national presence and were able to consider implementing their own solution to the infrastructure problem. Small banks looked at proposals by the likes of Citibank with scepticism, fearing that they would entrench the dominance of large banks. George W. Mitchell (1904-1997), a member of the Board of the Federal Reserve, and management consultant John Diebold (1926-2005) were outspoken promoters of the adoption of cashless solutions, but their lobbying of public and private spheres was not always successful. Perhaps the biggest chasm between banks and retailers, though, resulted from the capital-intensive nature of the potential network and infrastructure that any form of EFTS required.

 

Early use of an ATM at Dahl’s Foods supermarket in Iowa, circa 1975.

Photo credit: Courtesy of Diebold-Wincor Inc.

Amongst the alternative solutions that were trialled by banks and retailers, there were a number of successes, such as ATMs (Bátiz-Lazo, 2018) and credit cards (Ritzer, 2001; Stearns, 2011). Both bankers and retailers were quick to see a potential connection between the machine-readable cards and the rapid spread of new bank-issued credit cards under the new Interbank Association (i.e., the genesis of Mastercard) and the Bankamericard licensing system (i.e., the genesis of Visa), both of which began in 1966, just as the vision of the cashless society was winning acceptance. Surveys from the time indicate that at least 70 percent of bankers believed that credit cards were the first step toward the cashless society and that they were entering that business in order to be prepared for what they saw as an inevitable future (Bátiz-Lazo et al., 2014).

There were also a number of less successful attempts that, far from being relegated to the ignominy of the business archives, offer an important insight into the implementation of a cashless economy which is worth preserving for future generations of managers and scholars. Chief amongst these is a system widely deployed by the alliance of U.S. savings and loans (S&L) with mid-sized retailers under the sobriquet “Hinky Dinky”. Interestingly, Maixé-Altés (2012, 213-214) offers an account of a similar, independent, and contemporary experiment in a very different context, Spain. The Hinky Dinky moniker was derived from an experiment by the Nebraskan First Federal Savings and Loan Association, which in 1974 placed computer terminals in stores of the Hinky Dinky grocery chain, which at its apex operated some 50 stores across Iowa and Nebraska. The Hinky Dinky chain was seen by the First Federal Savings and Loan Association as the perfect retail partner for this experiment owing to the supermarket’s popularity with local customers, an appeal that would be beneficial to this new technology. The popularity of Hinky Dinky was particularly valuable, as the move by First Federal Savings and Loan to establish an offsite transfer system challenged, but did not break, banking law at that time (Ritzer, 1984).

At the heart of the technical EFT system initiated by First Federal, formally known as Transmatic Money Service, was a rudimentary, easy-to-install package featuring a point-of-service machine, with limited accessory equipment in the form of a keypad and magnetic character reader. The terminal was housed in a dedicated booth within the store and was operated by store employees (making a further point of the separation between bank and retailer). The terminal enabled the verification and recording of transactions as well as the instant updating of accounts. The deployment of the terminals in Hinky Dinky stores shocked the financial industry because it made the Nebraska S&L appear to be engaging in banking activities, while the terminals themselves provided banking services to customers in a location that was not a licensed bank branch! 

The Philadelphia Savings Fund Society's negotiable order of withdrawal (NOW) account machine, "Act One."

Photo credit: Hagley Museum and Archives, Philadelphia Savings Fund Society Collection 2062, Box 13, PSFS Online News Bulletin, Vol. 79-127, September 13, 1979.

From its origins in a mid-sized retail chain in the Midwest, some 160 “Hinky Dinky” networks appeared across the USA between 1974 and 1982, before S&Ls abandoned them in favour of ATMs and credit cards. These deployments included a rollout in 1980 by the largest savings bank by assets in the USA at the time, the Philadelphia Savings Fund Society, or the PSFS. Rather than commit to the large capital investment that ATMs necessitated, without guarantees of their viability or a secure return on investment, the PSFS repurposed the “Hinky Dinky” terminals as part of the rollout of its negotiable order of withdrawal (NOW) accounts (commercialised as “Act One”).

The NOW accounts were launched in the early 1970s by the Consumer Savings Bank, based in Worcester, MA (today part of USBank), as a way to circumvent the ban on paying interest on current account deposits imposed on S&Ls by Depression-era regulation. Between 1974 and 1980, Congress took incremental steps to allow NOW accounts nationwide, something the PSFS wanted to take advantage of. Consequently, in February 1979, the PSFS signed an agreement with the Great Atlantic and Pacific Tea Company (A&P) to install Transmatic Money Service devices in 12 supermarket locations. This was part of the PSFS's wider strategy “to provide alternative means for delivering banking services to the public” (Hagley Archives: PSFS Collection).

These terminals did not, however, allow for the direct transfer of funds from the customer’s account to the retailer’s. Rather, the terminals, which were operated by A&P employees, were activated by a PSFS plastic card that the society issued to customers, and enabled PSFS customers with a Payment and Savings account to make withdrawals and deposits. The terminals also allowed PSFS cardholders and A&P customers to cash cheques.

The equipment used by PSFS, the Hinky Dinky devices, therefore represented an interesting middle ground: it improved transaction convenience for consumers, was low risk for the retailer, and was relatively less costly for banks and financial institutions than ATMs (Benaroch & Kauffman, 2000). 

One of the most interesting features of the Hinky Dinky terminals as they were deployed by the PSFS and First Federal Savings was that they represented co-operative initiatives between retail organisations and financial institutions. As mentioned before, this was not necessarily the norm at the time. As the legal counsel to the National Retail Merchants Association (a voluntary non-profit trade association representing department, speciality and variety chain stores) wrote in 1972: “Major retailers… have not been EFTS laggards. However, their efforts have not necessarily or even particularly been channelled toward co-operative ventures with banks” (Schuman, 1976, 828). These sentiments were echoed by more neutral commentators who similarly highlighted the lack of dialogue between retailers and financial institutions on the topic of EFTS (Sprague, 1974). The extent to which retailers provided financial services to their customers had long been a competitive issue in the retail industry: the ability of chain stores, such as A&P in groceries and F.W. Woolworth in general merchandise, to offer low prices and better value owed much to their elimination of credit and deliveries (Lebhar, 1952). With the advent of EFT, retail organisations' provision of financial services raised the prospect of this becoming a competitive issue between these two industries.

A JC Penney customer applying for a credit card ca. 1950s.

Photo credit: From J.C. Penney 1959 Annual Report.

The prospect of a clash between retailers and banks was increased, moreover, as there had always been other voices, other retailers, who had been willing to offer credit (Calder, 1999). In the early years of the 20th century, consumer demand for retailers to provide credit grew. This caused tension with the cash-only policies of department stores such as A.T. Stewart and Macy’s, and the mail order firms Sears Roebuck and Montgomery Ward (Howard, 2015). Nevertheless, it was hard to ignore such demand, as evidenced by Sears's decision to begin selling goods on instalment around 1911 (Emmet and Jeuck, 1950, 265). Twenty years later, in 1931, the company went a stage further by offering insurance products to consumers through the Allstate Insurance Company. Other large retail institutions, however, resisted the pressure to offer credit until much later (J.C. Penney, for instance, would not introduce credit until 1958). Credit activities by large retailers, nonetheless, helped prompt banks to explore their own credit cards as early as the 1940s, while leading to the successes of Bankamericard and the Interbank Association in the 1960s (Bátiz-Lazo and del Angel, 2018; Wolters, 2000).  

The barriers between banks and financial institutions on the one hand, and retailers on the other, remained fairly robust. Signs that this was beginning to change emerged in the 1980s, when retailers such as Sears began to offer more complex financial products (Christiansen, 1987; Ghemawat, 1984; Raff and Temin, 1999). Yet the more concerted activity by retailers to diversify into financial services would ultimately be stimulated by food retailers (Martinelli and Sparks, 2003; Colgate and Alexander, 2002). The Hinky Dinky system, however, shows that a co-operative, not just a competitive, solution was a very real possibility.   

In 2021 we are witnessing an extreme extension and intensification of these trends. Throughout the ongoing Covid-19 pandemic, the use of cash has greatly declined as more and more people switch to digital payments. In the retail industry, even before the pandemic, POS innovations were becoming increasingly digital (Reinartz and Imschloβ, 2017) as retailers shifted toward a concierge model of helping customers rather than simply focusing on processing transactions and delivering products (Brynjolfsson et al., 2013). Consequently, the retail-customer interface was already starting to shift away from one that prioritised the minimisation of transaction and information costs toward an interface which prioritised customer engagement and experience (Reinartz et al., 2019).

A second feature of the pandemic has been the massive increase in interest in cryptocurrencies, in their many different forms, around the world. This is most apparent in the volatility and fluctuations in the price of Bitcoin but is also evident in the increased prominence of alternative cryptocurrencies (such as Ether). Indeed, even central banks in Europe and North America are discussing digital currencies, the government of El Salvador has made Bitcoin legal tender, and the People’s Bank of China has launched its own digital currency in China. A further manifestation of the momentum cryptocurrencies are gaining is the private initiatives of big tech (such as Facebook’s Diem, formerly Libra). Yet, in spite of all of this latent promise, transactions at the point of sale with cryptocurrencies are still minuscule and, time and again, surveys by central banks on payment preferences consistently report that people want paper money to continue to play its historic role.

It thus remains too early to forecast with any degree of certainty what the actual long-run effects of the virus, social distancing, and lockdowns will be on the use of cash, how consumers acquire products and services, and what these products and services are. It is also uncertain whether greater use of cryptocurrencies will lead to a decentralised management of monetary policy (and if so, the rate at which this will take place). It is, though, almost certain that consumers’ behaviours, expectations, and habits will have been altered by their personal experiences of Covid. In this context, the story behind “Hinky Dinky” reminds us to be sober at a time of environmental turbulence and wary of extrapolating trends, and to better understand the motivations driving the adoption of new payment technologies, since some of these trends, like “Hinky Dinky”, might look to have wide acceptance but turn out to be a short-term phenomenon.      

Acknowledgements

We appreciate helpful comments from Jeffrey Yost, Amanda Wick and J. Carles Maixé-Altés. As usual, all shortcomings remain the responsibility of the authors.


Bibliography

Basker, E. (2012). Raising the Barcode Scanner: Technology and Productivity in the Retail Sector. American Economic Journal: Applied Economics, 4(3), 1-27.

Bátiz-Lazo, B. (2018). Cash and Dash: How ATMs and Computers Changed Banking. Oxford: Oxford University Press.

Bátiz-Lazo, B., Maixé-Altés, J. C., & Thomes, P. (2011). Technological Innovation in Retail Finance: International Historical Perspectives. London and New York: Routledge.

Bátiz-Lazo, B., Haigh, T., & Stearns, D. L. (2014). How the Future Shaped the Past: The Case of the Cashless Society. Enterprise & Society, 15(1), 103-131.

Bátiz-Lazo, B., & del Ángel, G. (2018). The Ascent of Plastic Money: International Adoption of the Bank Credit Card, 1950-1975. Business History Review, 92(3 (Autumn)), 509 - 533.

Bátiz-Lazo, B., & Wood, D. (2002). An Historical Appraisal of Information Technology in Commercial Banking. Electronic Markets - The International Journal of Electronic Commerce & Business Media, 12(3), 192-205.

Benaroch, M., & Kauffman, R. J. (2000). Justifying Electronic Banking Network Expansion Using Real Options Analysis. MIS Quarterly, 197-225.

Bucklin, L. P. (1980). Technological Change and Store Operations: The Supermarket Case. Journal of Retailing, 56(1), 3-15.

Calder, L. (1999). Financing the American Dream: A Cultural History of Consumer Credit. Princeton: Princeton University Press.

Colgate, M., & Alexander, N. (2002). Benefits and Barriers of Product Augmentation: Retailers and Financial Services. Journal of Marketing Management, 18(1-2), 105-123.

Cortada, J. (2004). The Digital Hand, Vol 1: How Computers Changed the Work of American Manufacturing, Transportation, and Retail Industries. Oxford: Oxford University Press.

Christiansen, E. T. (1987). Sears, Roebuck & Company and the Retail Financial Services Industry (Part Two). Case 9-387-182. Cambridge, MA: Harvard Business School.

Doms, M. E., Jarmin, R. S., & Klimek, S. D. (2004). Information Technology Investment and Firm Performance in US Retail Trade. Economics of Innovation and New Technology, 13(7), 595-613.

Emmet, B., Jeuck, J. E., & Rosenwald, E. G. (1950). Catalogues and Counters: A History of Sears, Roebuck and Company. Chicago: University of Chicago Press.

Ghemawat, P. (1984) Retail Financial Services Industry, 1984. Case 9-384-246, Cambridge, MA: Harvard Business School.

Hagley Museum and Archives Philadelphia Savings Fund Society Collection 2062 Box 13 POS Program Introduction March 2, 1978.

Howard, V. (2015). From Main Street to Mall: The Rise and Fall of the American Department Store. Philadelphia: University of Pennsylvania Press.

Jarmin, R. S., Klimek, S. D., & Miranda, J. (2009). The Role of Retail Chains: National, Regional and Industry results. In Producer dynamics: New evidence from micro data (pp. 237-262). Chicago: University of Chicago Press.

Lebhar, G. M. (1952). Chain stores in America, 1859-1950. New York: Chain Store Publishing Corporation.

Martinelli, E. and Sparks, L. (2003). "Food Retailers and Financial Services in the UK: A Coopetitive Perspective," British Food Journal, Vol. 105, No. 9, pp. 577-590.

Maixé-Altés, J. C. (2012). Innovación y compromiso social. 60 años de informatización y crecimiento. Barcelona: "la Caixa" Group.

Maixé-Altés, J. C. (2019): "The Digitalisation of Banking: A New Perspective from the European Savings Banks Industry before the Internet," Enterprise and Society, Vol. 20, No. 1, pp. 159-198.

Maixé-Altés, J. C. (2020): "Retail Trade and Payment Innovations in the Digital Era: A Cross-Industry and Multi-Country Approach", Business History Vol. 62, No. 9, pp. 588-612.

Maixé-Altés, J. C. (2021): "Reliability and Security at the Dawn of Electronic Bank Transfers in the 1970s-1980s". Revista de Historia Industrial, Vol. 81, pp. 149-185.

Maixé-Altés, J. C. and Castro Balguer, R. (2015): "Structural Change in Peripheral European Markets. Spanish Grocery Retailing, 1950-2007", Journal of Macromarketing, Vol. 35, No. 4, pp. 448-465.

Raff, D., & Temin, P. (1999). Sears, Roebuck in the Twentieth Century: Competition, Complementarities, and the Problem of Wasting Assets. In Learning by doing in markets, firms, and countries (pp. 219-252). Chicago: University of Chicago Press.

Reinartz, W., & Imschloβ, M. (2017). From Point of Sale to Point of Need: How Digital Technology is Transforming Retailing. NIM Marketing Intelligence Review, 9(1), 42.

Reinartz, W., Wiegand, N., & Imschloβ, M. (2019). The Impact of Digital Transformation on the Retailing Value Chain. International Journal of Research in Marketing, 36(3), 350-366.

Ritzer, G. (2001). Explorations in the Sociology of Consumption: Fast Food, Credit Cards and Casinos. Thousand Oaks, CA: Sage Publishing.

Ritzer, J. (1984). Hinky Dinky Helped Spearhead POS, Remote Banking Movement. Bank Systems and Equipment, December, 51-54.

Schuman, C. R. (1975). The Retail Merchants' Perspective Towards EFTS. Catholic University Law Review, 25(4), 823-842.

Sprague, R. E. (1977). Electronic Funds Transfer In Europe: Their Relevance for the United States. Savings Banks International, 3, 29-35.

Sprague, R.E. (1974) Electronic Funds Transfer System. The Status in Mid-1974 – Part 2. Computers and People 23(4).

Stearns, D. L. (2011). Electronic Value Exchange: Origins of the Visa Electronic Payment System. London: Springer-Verlag.

United States. National Commission on Electronic Funds Transfer. (1977). EFT and the public interest: a report of the National Commission on Electronic Fund Transfers. Second printing. Washington: National Commission on Electronic Fund Transfers.

Wolters, T. (2000). ‘Carry Your Credit in Your Pocket’: The Early History of the Credit Card at Bank of America and Chase Manhattan. Enterprise & Society, 1(2), 315-354.

 

Bátiz-Lazo, Bernardo and Buckley, Tom R. (August 2021). “Early 'Frictions' in the Transition towards Cashless Payments.” Interfaces: Essays and Reviews in Computing and Culture Vol. 2, Charles Babbage Institute, University of Minnesota, 64-75.


About the authors: 

Dr. Bernardo Bátiz-Lazo is Professor of FinTech History and Global Trade at Northumbria University in Newcastle upon Tyne, UK and research professor at Universidad Anahuac (Mexico). He is a fellow of the Royal Historical Society and the Academy of Social Sciences. 

Dr. Tom Buckley is currently Lecturer in International Business Strategy at the University of Sheffield. Dr. Buckley received his PhD from the University of Reading’s Henley Business School in 2017. 


 

Top 10 Signs We Are Talking About IBM’s Corporate Culture

James W. Cortada, Senior Research Fellow, Charles Babbage Institute 

Abstract: Using the format of a late-night TV humorist's way of discussing an issue, this article defines ten characteristics of IBM's corporate culture deemed core to the way IBM functioned in the twentieth century. The perspective is that of individual employees, conjuring up humorous images of what they thought of each other's behavior. However, corporate images are serious matters, so as historians focus more attention on corporate cultures, such images will need to be understood. The list also demonstrates that the common image of IBM personnel as serious did not always reflect reality within the company.

(PDF version available for download.)

IBM education center in Endicott, NY.
The entrance staircase into an IBM education center in Endicott, NY, used from the 1930s through the 1980s. All employees were routinely required to undergo about two weeks of training each year. 

Between 1982 and 2015, comedian and American television host David M. Letterman hosted highly popular late-night television comedy shows, Late Night with David Letterman and later the Late Show with David Letterman. One of his recurring presentations was the “Top 10 List,” which ranked features of an issue from tenth to first. These became wildly popular, were satirized, and were taken to be comedic commentary on contemporary circumstances. He and his staff produced nearly 700 such lists, while Letterman hosted over 6,000 programs viewed by millions of people. He published four book-length collections of these lists. The genius of these lists, as in any good comedy, lay in providing a slightly fractured view of reality. On occasion, he commented on corporations, such as GE, AT&T, Exxon, McDonald's, QVC, and Westinghouse, among others.  

In recent years scholars have become increasingly interested in the history of corporate cultures. For decades historians said corporate cultures were important, but they did little work on the subject. That is beginning to change. I am exploring IBM’s corporate culture, which historians and employee memoirists all claim was central to the company’s ability to thrive and survive longer than any other firm in the IT industry. So, to do that, historians have to find new types of documentation informing their study of the subject. That is why we are turning to comedian David Letterman. His lists reflected public interest in a topic, and they captured perspectives people would either agree with or that made sense to them given what they knew of the subject. What better way is there to see what IBM’s image looked like than with “10 Signs You Might Be Working at IBM”:

 

10. You lecture the neighborhood kids selling lemonade on ways to improve their process.

Thomas J. Watson Sr., head of IBM from 1914 to 1956, made exploring with customers how best to use his tabulators, then computers, a central feature of his selling method. Tens of thousands of sales personnel did this. In the process they and other employees improved IBM’s operations and were seen as experts on efficient, productive business operations. As the company grew in size, so too did its reputation for having “its act together” on all manner of business and technical issues. That reputation expanded, crowned with the acquisition of 30,000 management and IT consultants from PwC in 2002. There was no industry or large company immune from IBM’s views on how to improve operations, nor any community organization involving IBMers. (Side note: Your author, a retired IBMer, taught his daughters how to improve their Girl Scout cookie sales process.)

IBMers in suits
IBM salesmen dressed for the part.

9. You get all excited when it’s Saturday so you can wear shorts to work.

The two things most people know about IBM in any decade are its THINK sign and that its employees always wore dark suits, white shirts, regimental ties, black socks and shoes. It turns out they did that not because someone at the top of the corporation mandated such a dress code, but rather because that is how customers tended to dress. There was no official dress code, although most IBMers can tell a story of some lower-level manager sending someone home to change out of their blue shirt or, shockingly, out of their brown loafers. But Thomas J. Watson, Jr., CEO between 1956 and 1971, finally opined on the subject in 1971, confessing that IBM’s customers dressed in a “conservative” manner. He thought, “it is safe to say that the midstream of executive appearance is generally far behind the leading edge of fashion change.” So, “a salesman who dresses in a similar conservative style will offer little distraction from the main points of business discussion and will be more in tune with the thinking of the executive.” That is why, “we have always had a custom of conservative appearance in IBM.” People are thus asked, “to come to work appropriately dressed for their job environment.” That’s it: the smoking gun, the root source of the true IBM dress code policy! Millions of people have met IBM employees wearing blue suits. Sociologists point out that every profession has its uniform (e.g., university students their blue jeans, cooks their tall white hats, IBMers their wingtips and white shirts). The inference to be drawn from the Letterman quip is that IBM employees were willing to put in long hours on behalf of their company.

IBM President Thomas J. Watson, Sr, ca. 1950s.

8. You refer to the tomatoes grown in your garden as deliverables.

Deliverables is a word with a long history at IBM, dating back at least to the 1950s. While its exact origins are yet to be uncovered, technical writers worked with product developers on a suite of publications that needed to accompany all product announcements or modifications to them. In fact, one could not complete a product development plan without including as one of its deliverables a communication plan in writing. The plan had to include when and how press releases, General Information and user and maintenance manuals would be published, and how these would accompany products to customers’ data centers. When IBM entered the services business in the 1990s, focusing largely on management and strategy consulting with the creation of the IBM Consulting Group, use of the word expanded across IBM. Management consultants in such firms as Ernst & Young, Booz Allen and McKinsey, among others, always concluded their consulting projects with a final report on findings and recommendations, which they, too, called deliverables. So as these people came to work at IBM, they brought their language with them. By the end of the 1990s, it seemed everyone was using deliverables to explain their work products. So, the Letterman list got right another IBM cultural trait that proved so pervasive that even employees did not realize they said deliverables.

 

7. You find you really need Freelance to explain what you do for a living.

Oh, this one hurts if you are an IBMer. The “ouch” comes from the fact that during the second half of the twentieth century, employees attended meetings armed with carefully prepared slide presentations. They began with Kodak film slides, then 8 x 10 transparencies called foils (IBMers may have had an exclusive on that word), followed by the precursor of PowerPoint called Freelance. Every manager, it seemed, carried a short presentation about their organization, what it did, and so forth. By the end of the 1960s it seemed no proposal to a customer or higher-level manager was missing the obligatory presentation. When Louis V. Gerstner came to IBM as its new CEO in 1993, he immediately noticed this behavior and essentially outlawed the practice among his executives when meeting with him. He wanted them to look into his eyes and talk about “their business.” Eventually he retired, and by the early 2000s PowerPoint was back. By the 2010s the nearly two hundred firms acquired by IBM came fully stocked with employees who, too, clutched their slide presentations. Edward Tufte, the Yale professor most noted for his multi-decade criticism of PowerPoint presentations and for his studies of the graphical presentation of statistics and other data, must surely have had IBM in mind, although he admitted many corporations suffered from similar behavior.

 

6. You normally eat out of vending machines and at the most expensive restaurant in town within the same week.

This observation required true insight. One of the TV writers must have interviewed a salesman, consultant, manager, or executive to stumble across this one. IBM people spent a considerable amount of time traveling to visit customers, attend internal and industry meetings, fulfill their normal requirement of two weeks of training every year, attend conferences to make presentations, or meet with government officials. By the 1980s it was not uncommon for large swaths of IBM to be organized in some matrix manner in which one’s immediate supervisor was in another country while yet another manager with whom one had to work was perched elsewhere. To kiss the ring, one had to travel to wherever that manager held court. Some professions, such as sales, consulting, and middle and senior management, turned themselves into tens of thousands of “road warriors.” So, one might fly to a city and take a customer out to dinner at a magnificent restaurant to build personal rapport and to conduct business, in slang terms sometimes referred to as “tavern marketing,” but then rush afterwards to the airport to catch the “red eye” overnight flight home or to some other destination for yet another meeting. That could mean eating vending machine food after an airport’s restaurants had closed, at a work location that had no restaurant, or when there was no time to rush out for something. IBMers also prided themselves on making their flights “just in time,” meaning no time for a leisurely meal. You were complimented if you reached the airplane’s door just as it was about to be closed.

 

5. You think that “progressing an action plan” and “calenderizing a project” are acceptable English phrases.

Since at least the 1970s, employees putting together those famous slide presentations were retreating from writing full sentences and engaging in the very bad habit of turning nouns into verbs. Technical writers in the firm eschewed such behavior, as did the media relations community. Employees working in headquarters jobs in the United States were particularly notorious offenders. Letterman may not have known of the most widely used example, “to solution” something, or its variant, “I will solution that problem.” The usage was intended to project force, action, determination, and leadership. Nobody seemed embarrassed by their ignorance of the English language. If one worked in the same building as hundreds or thousands of people without visiting too many other workplaces, local speech patterns became evident. The New York area’s IBM community was notorious; they wanted people to come to them, and when that happened visitors were subjected to such language. As cultural anthropologists going back to Claude Lévi-Strauss in the 1930s have pointed out, tribes form their own language tied to their cultures and lifestyles. IBMers were guilty of the same. It is part of the behavior that led to such usages as “foils.”

 

4. You know the people at faraway hotels better than your next-door neighbors.

This has to qualify as true for some road warriors. It ties to No. 6 about vending machines. Consultants, in particular, would leave home on a Sunday night or Monday morning and not return until Friday night if on long-term projects. They were commuting, and so when home they took care of domestic chores or spent time with their families. It was—is—not uncommon for employees to know the names of flight attendants and hotel staff, since those individuals, too, had set work schedules. Knowing the names of restaurant staff working near a client’s offices was—is—not uncommon, either. Such knowledge could be exotic, such as knowing the flight attendant assigned to one’s Monday morning flight to Orange County, California, and at the same time the doorman at one’s favorite Lebanese restaurant in Paris. This is not conceit, just the reality that IBM employees did a considerable amount of traveling in the course of their careers. It was both an attraction and a burden. Travel made work interesting but also taxed one’s circadian rhythms, not helped by rich food or vending machine delights.

IBM awards dinner held in Endicott New York in the late 1940s
The company made sure families were also involved in IBM-sponsored events to strengthen the bonds and its corporate culture. This is an awards dinner held in Endicott, New York in the late 1940s.

3. You ask your friends to “think outside the box” when making Friday night plans.

In IBM’s culture, solving problems is a practice shared by all employees almost all the time. It became a worldview, a framework, and an attitude toward activities in both their professional and private lives. It has been fostered within the firm since the 1910s, largely because the products it sold required addressing customer issues as well as challenges to IBM’s own internal operations. Over the years, language and phrases emerged that were embraced to speak to that issue. Thinking outside the box spoke to the frequent need to come up with a solution that had not been tried before. That behavior prized imagination and, equally, a reluctance to accept no as an answer to a request. For a century, for example, salesmen and engineers were taught that, when encountering an objection or a problem, they should not take it personally but decompose it to understand what it really is, and then come up with a “fix” for it. There is an age-old sales adage that helps here: “The selling doesn’t start until the customer says no.” Flipping a “no” into a “yes” requires “thinking outside the box.” The same mindset was applied in one’s private life too.

 

2. You think Einstein would have been more effective had he put his ideas into a matrix.

Someone must have had spies in IBM or been a business school professor of organizational theory, because by the 1960s much of IBM was organized like a matrix. As one student of IBM’s culture with experience studying corporate structures explained: “I’ve never seen this in any other company,” adding, “with all those dotted lines and multiple bosses.” It worked, however, because everyone subscribed to a common set of values and behaviors, and all had documented performance plans that stated explicitly what they were to do. Where one sat in IBM ensured that everyone’s slide presentation included an organization chart to which the speaker could point to explain where they perched. Another observer opined that “It is probably the most complex organization that I have seen”; hence the allusion to the brilliant Albert Einstein. Hundreds of thousands of employees lived in such matrices and somehow it all worked, because IBM made profits, with a few exceptions that Letterman and his audience might not even have been aware of, since most stockholders were the wealthy and institutional investors.

Following Letterman’s practice: “And the Number One Sign That You Work at IBM” with a drum roll, of course:

 

1. … You think a ‘half-day’ means leaving at 5 o’clock.”

Employees had a work ethic that customers saw displayed in many ways: travel schedules, customer engineers working around the clock and over weekends to install and repair hardware and software, consultants who showed up at 8 in the morning and left at 7:30 in the evening to dine at one of those fine restaurants or to wolf down pizza as they prepared for a client presentation to be made first thing the next morning. It was a life of endless dinners with clients and one’s management, or student teams working on case studies until midnight in some training program. Weekend planning retreats were all too common, especially in the fall as IBMers prepared for the next year, or for spring reviews, a ritual requiring weeks of preparation for when executive management would swoop in to inspect, often knowing as much about one’s business as the presenters. The company did nothing to hide the long hours its employees put in—it exemplified the wisdom of the Grand Bargain, which held that in exchange for working loyally and hard, one was assured a lifetime of employment at IBM. The 5 o’clock comment recognized that employees were seen as far more loyal to the firm, and as defenders of its ways, than was evident at other companies. One sees such comments in bits and pieces in memoirs and accounts of the IT industry, but the Letterman list cleverly summed it up.

25 years of service image
For the most part, for over eight decades, employees found their employment with IBM a source of pride. Those with 25 years of “service” were considered an elite group, at least until the 1990s.

So, what image did the Letterman List portray of IBMers to millions of people? While many had a good laugh, it affirmed that IBM’s employees were serious, knowledgeable, seemingly always on duty (even at home and in their neighborhoods), focused on results, imaginative, and possessed of their own ways of doing and talking. IBM had purposefully worked on developing that image since the 1910s and a century later still retained it. It was part of a larger, hardly discussed corporate strategy of creating an information/business ecosystem in the world of IT, which it dominated.

But, of course, what Letterman may have missed are the many other lists, such as the hundreds of line items defining IBM (e.g., I’ve Been Moved). IBMers did not sleep wearing their black wingtip shoes, nor cut their lawns wearing white shirts. They actually had a sense of humor, as historians are beginning to discover. IBMers conjured up comedic skits across the entire company around the world. They did standup joke telling and, of course, sang songs, often with lyrics tailored to some Letterman-like observation about IBM.

But here is the punch line. Letterman never drew up this list; it is a spoof, prepared by an IBMer, that circulated around the Internet. It was probably written in 1997, while IBM’s old culture was still very much in place, and what it records are IBM employees’ own insights into the company’s culture. In short, there is more accuracy in this list than the comedian could have conjured up. But it was done so well that you have to admit you believed it.

On a more serious note, corporate image is an important issue. Today, for example, Facebook is being criticized for being irresponsible about the flow of accurate information through society. It must have some employees who cringe that the driveway into their corporate headquarters is named Hacker Way, which suggests a company with teenage-like behavior even though it has become an important component of modern society. IBM studiously avoided such traps. Amazon, which enjoyed a positive image for years, was recently criticized for working conditions that led to an attempted unionization effort at an Alabama facility and for its aggressive actions to crush the initiative. President Joe Biden even publicly supported the unionizing effort. IBM never unionized in the United States; it never had a counter-culture name for a road; and it never spoke about breaking things, but rather about building them. From the beginning it wanted to be seen as a firm bigger than it was and as a serious, responsible pillar of society. Today’s business titans have much to learn from IBM’s experience.

One wonders how Letterman would have treated Apple, Microsoft, Facebook, Cisco, Amazon, Verizon, or Disney. He poked fun at other companies and, at least within IBM when he was popular on television, employees came up with their own Top 10 lists all the time. If these other companies would be embarrassed by the humor, it suggests that Letterman has some business management lessons to teach them too.


Bibliography

Cortada, James W. (2019).  IBM: The Rise and Fall and Reinvention of a Global Icon.  Cambridge, Mass.: MIT Press.

Pugh, Emerson W. (1995).  Building IBM: Shaping an Industry and Its Technology.  Cambridge, Mass.: MIT Press.

Watson, Thomas J., Jr. (1963, 2000). A Business and Its Beliefs: The Ideas that Helped Build IBM.  New York: McGraw-Hill.

___________ and Peter Petre. (1990).  Father, Son, and Co.: My Life at IBM and Beyond. New York: Bantam.

 

Cortada, James W. (July 2021). “Top 10 Signs We Are Talking About IBM’s Corporate Culture.” Interfaces: Essays and Reviews in Computing and Culture Vol. 2, Charles Babbage Institute, University of Minnesota, 55-63.


About the author:

James W. Cortada is a Senior Research Fellow at the Charles Babbage Institute, University of Minnesota—Twin Cities. He conducts research on the history of information and computing in business. He is the author of IBM: The Rise and Fall and Reinvention of a Global Icon (MIT Press, 2019). He is currently conducting research on the role of information ecosystems and infrastructures.


 

NFTs, Digital Scarcity, and the Computational Aura

Annette Vee, University of Pittsburgh

Abstract: Here, I draw on Walter Benjamin’s discussion of the aura of original art in “The Work of Art in the Age of Mechanical Reproduction” to explore the appeal of NFTs (non-fungible tokens) in the age of digital reproduction. I explain what NFTs on the blockchain are and point to other attempts at scarcity in digital contexts, including Cryptokitties and the Wu-Tang Clan’s Once Upon a Time in Shaolin. Just as Bitcoin emerged from the 2008 financial crisis, NFTs have gained traction in the Covid-19 pandemic, demonstrating that scarce, rivalrous, positional goods are desirable even when computational networks afford perfect replication at scale.

(PDF version available for download.)

*Please note: Explicit language quoted in this article may be offensive to some readers.

Beeple's Everydays: The First 5000 Days
Figure 1: Screenshot of Christie's page showing the final price and image of Beeple's Everydays: The First 5000 Days, https://onlineonly.christies.com/s/beeple-first-5000-days/beeple-b-1981-1/112924

 

“holy fuck,” Beeple tweeted at 10:42 a.m. on March 11, 2021, when his artwork “Everydays: The First 5000 Days,” a jpg file measuring 21,069 pixels square, sold for $69,346,250 in an online auction at Christie’s. Holy fuck, indeed: the first all-digital artwork sold at Christie’s, a composite of edgy, meme-worthy images the artist had posted every day since 2007, fetched a price in the same league as works by van Gogh, Picasso, Rothko, and Warhol. In 1987, another Christie’s auction made headlines: Vincent van Gogh’s Still Life: Vase with Fifteen Sunflowers (“Sunflowers”) sold for nearly $40 million, tripling the record from any previous sale of art. Putting aside comparative judgments of quality, the nearly $70 million for Everydays was a lot of money to pay for a piece of art that could be perfectly replicated and stored on any given laptop. What made Everydays more like Sunflowers than millions of other jpgs?

Everydays was minted on February 16, 2021, and assigned a non-fungible token (NFT) on the Ethereum blockchain. This NFT authenticates Everydays and distinguishes it from any other bit-for-bit copy of the same file. Where a van Gogh listing on Christie’s site might declare medium, date, and location (e.g., oil on canvas, 1888, Arles), Everydays lists pixel dimensions, a token ID, a wallet address, and a smart contract address. In a Special Note on the Everydays auction, Christie’s declared it would accept the cryptocurrency Ether, but only in digital wallets hosted by a select group of platforms. Implicit in the listing of these addresses, the token ID, and trusted platforms is an attempt at digital authenticity.

Digital reproduction enables exact copies of art. Even when artists employ watermarks to encourage payment for digital art, the same tools that make and mark the images can be used to restore and replicate them. NFTs secure digital art not by changing the file itself, but by changing its provenance. NFTs attach a unique identifier, or token, to represent the art on the blockchain. Their non-fungibility differentiates them from cryptocurrencies relying on the blockchain. Bitcoin and Ether, for example, are fungible: any Bitcoin spends the same as any other Bitcoin. And like the fiat currency of the dollar, Bitcoin can be spent anywhere that particular cryptocurrency is accepted. In contrast, a non-fungible token is intentionally unique and cannot be spent. But because any given NFT or cryptocoin has a unique position on the blockchain, it cannot be counterfeited. Blockchain security relies on long chains of transactions, each dependent on the previous transaction, with the entire series of transactions made public. Altering one transaction would require altering the copy of the record decentrally stored across thousands of machines simultaneously. In other words, it’s effectively impossible.
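To make concrete how each transaction depends on the one before it, here is a minimal Python sketch of a hash-chained ledger. The record fields and values are invented for illustration; real blockchains add many layers (Merkle trees, digital signatures, consensus) on top of this basic idea.

```python
import hashlib
import json

def block_hash(record: dict, prev_hash: str) -> str:
    """Hash a record together with the previous block's hash."""
    payload = json.dumps(record, sort_keys=True) + prev_hash
    return hashlib.sha256(payload.encode()).hexdigest()

# Build a tiny ledger: each entry depends on the hash of the one before it.
ledger = []
prev = "0" * 64  # starting value for the first block (illustrative)
for record in [{"from": "alice", "to": "bob", "amount": 1},
               {"from": "bob", "to": "carol", "amount": 1}]:
    h = block_hash(record, prev)
    ledger.append({"record": record, "prev_hash": prev, "hash": h})
    prev = h

# Tampering with an early record invalidates the stored hash and every later one.
ledger[0]["record"]["amount"] = 100
recomputed = block_hash(ledger[0]["record"], ledger[0]["prev_hash"])
print(recomputed == ledger[0]["hash"])  # False: the chain no longer verifies
```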

When a digital piece of art has an NFT associated with it, it’s been marked as authentic by the artist or an agent with the power to authenticate it. While the digital art itself might be reproducible, the version that’s on auction ostensibly has the imprimatur of the creator. It’s been digitally touched by the artist. You, too, can own an exact copy of Everydays by downloading it here. But you can’t be part of the transaction that was recorded on the Ethereum blockchain, which involved Beeple transferring the token to the winner of the auction. (For more details on the exact chain of transactions from digital file to blockchain to auction and purchase, Robert Graham offers a more technical breakdown.)

Renaissance artists such as Leonardo da Vinci painted in a studio with the help of assistants. What makes a da Vinci a da Vinci isn’t that he painted all of it, but that he painted at least some of it, and that a community of experts takes some responsibility for that claim (Langmead et al.). We can compare a da Vinci painting to Jack Dorsey’s first tweet, which has been reproduced everywhere but is now associated with an NFT (Boucher). It’s like having Jack Dorsey touch the tweet before selling it, adding his splash of paint. The scarcity is what makes it valuable; the NFT buyer owns something that others do not and cannot. One of the reasons that Salvator Mundi sold for $450 million—shattering all previous records for art auctions—is that it is one of only 20 paintings attributed to da Vinci (Langmead et al.). Dead artists generally fetch more for their work than living artists because they aren’t making any more art (Jeff Koons holds the living-artist record for his metal sculpture Rabbit, sold for $91.1 million in 2019). For NFTs as well as physical art, scarcity depends on human trust in the creator as well as the system that verifies its connection with the creator.

Still Life: Vase with Fifteen Sunflowers
Figure 2: Still Life: Vase with Fifteen Sunflowers, by artist Steven Vee.

I own a version of Still Life: Vase with Fifteen Sunflowers, which hangs on a wall in my family room. This Sunflowers is an original piece of art, has a traceable provenance, and is beautiful. It has an aura just as the one that sold at auction for nearly $40 million in 1987 does. But the reason my copy wouldn’t fetch the same price at auction (though I admit I haven’t tried) is that the artist is Steven Vee, my dad. His paintings are highly valued in the diaspora of my hometown but are unknown to the van Gogh connoisseurs who bid at Christie’s. There’s the matter of the work’s age (15 years vs. 100 years) and materials (acrylic vs. oil paints). But the main difference between the two pieces of art is their aura: who imbued them with the aura, how they painted them, where, and who has owned them. My Sunflowers is valuable to me, but probably not to Christie’s. (Although if it were, I would let it go for a mere $20 million—sorry, Dad.)

Digital scarcity

Scarcity is a default attribute for a physical piece of art: both Vincent van Gogh's Sunflowers and Steve Vee's Sunflowers have multiple versions, but each individual painting is unique. Artificial scarcity has been the primary solution to the aura problem for mechanical reproduction of art. Limited print runs can ensure that a collector has one of only 20 prints, even if it’s technically possible to produce hundreds of them. Although the print may not be directly touched by the creator, its scarcity gives it value.

But scarcity is tricky with digital work. The fact that digital files are perfectly and infinitely reproducible makes it difficult to limit copies, at least once a digital file is released to another party. Perfect replication is one of the advantages of digitality, but it works against exclusive ownership. NFTs are a solution to this problem, but there have been others, each specific to its digital and social context.

In virtual spaces, scarcity emulates physical spaces. In the virtual world of Second Life, which was popular in the early 2000s and had a GDP to rival Samoa’s in 2009, users can build and buy property (Fleming). While the number of islands on which to build is theoretically infinite, any particular island and its construction are ensured to be unique because the world is hosted by one company, Linden Lab. In Second Life, particular goods and constructions could be copied and were the subject of intellectual property debates and court cases. And property ownership is subject to Linden Lab’s continued management and discretion.

Second Life
Figure 3: Launched in 2003, Second Life is still around. Screenshot from https://secondlife.com/

In high-stakes online poker in the early 2000s, big-time players sold expensive coaching manuals as pdf files and protected their scarcity by introducing a small variation in each version—an extra comma on page 34, for instance (Laquintano). It’s very easy to circulate a pdf: pdfs have a small file size and are easily stored and read with default programs on consumer machines. But anyone paying that much for a pdf wants security in knowing that their pdf won’t circulate easily, especially since the manuals contained poker strategies; like limited print runs, the pdfs lose value the more widely they are held. If the pdf manual got out, an author could trace the particular variation back to the original buyer and enforce the sales contract with social consequences in the poker community.

Another tactic to make a digital work scarce is to keep it out of digital networks altogether. This was Cilvaringz and RZA’s tactic with the Wu-Tang Clan’s 2015 album Once Upon a Time in Shaolin. Just one copy of the album was pressed, put in a bejeweled silver box with leather-bound liner notes, then put up for auction, where it fetched $2 million. The book accompanying the album says specifically: “This book has not been catalogued with the Library of Congress.” The scarcity was ensured through the singular pressing as well as an agreement forbidding the buyer to exploit it commercially for 88 years (though it didn’t curtail free distribution). The buyer turned out to be the infamous Martin “Pharmabro” Shkreli, who streamed parts of it to celebrate Trump’s victory in 2016; the album was later seized by the US government after his fraud conviction.

With Once Upon a Time in Shaolin, the Wu-Tang Clan recaptured the aura of original art in digital music. Cilvaringz and RZA were frustrated at the devaluing of music through pirating and streaming and sought to make Once Upon a Time in Shaolin an art object. In an interview with Rolling Stone, RZA said, "It's kind of crazy. The record has become an entity, very different from a lot of albums. It's like the Mona Lisa. It's got its own folklore, and that's what me and [co-producer] Cilvaringz wanted." Speaking of digitized music, RZA said, “OK, nobody don't see the value on it, and we gonna put a value on it. We wanna say, 'This is what we think it's worth'" (Grow). On the album’s website, they channel Benjamin’s description of the aura in the age of mechanical reproduction:

Mass replication has fundamentally changed the way we view a recorded piece of music, while digital universality and vanishing physicality have broken our emotional bond with a piece of music as an artwork and a deeply personal treasure. […] We hope to inspire and intensify urgent debate about the future of music, both economically and in how our generation experiences it. We hope to steer those debates toward more radical solutions and provoke questions about the value and perception of music as a work of art in today’s world.

While a singular, high-profile release of an album might not be a general solution to the aura problem of digital art, it certainly worked for the Wu-Tang Clan.

Once upon a time
Figure 4: Screenshot from the official website of Once Upon a Time in Shaolin, http://scluzay.com/.

Enter the Blockchain

Blockchain technology enables new approaches to artificial scarcity for digital work. The protocol for the blockchain was specified with the release of a white paper describing the Bitcoin currency by Satoshi Nakamoto (a pseudonym) in 2008, at the height of the financial crisis. The protocol Nakamoto described took care of several problems with cashless digital transactions, including authorization, privacy, and double-spending. Prior to the Nakamoto white paper, it was only possible to check these boxes with the help of a trusted financial institution. Another problem with previous attempts at digital currency was uptake, or literal buy-in. In 2008, with trust in these institutions at a nadir, Bitcoin was a revelation (Brunton). People were ready to trust a new computational protocol.

Blockchain is essentially a ledger of transactions, with each transaction occupying a unique position on the ledger’s chain of records. The ledger is recorded not centrally in a bank’s records, but decentrally, on the computers of the participants. The ledger of record is determined by consensus and influenced by who carries the record of the longest chain of transactions. So it’s not possible for an interloper to drop in and change the consensus ledger, unless they somehow control 51% of the participating recorders.

Transactions are grouped into blocks to be verified by participants, who must crack a complex computational problem to verify the block. This process is called “proof of work” because the problem requires a huge amount of computational brute force to solve. Participants, called “miners” because they are mining for computational solutions, are incentivized to verify blocks by the chance to earn cryptocurrency if they are the first to crack the problem. The enticing incentive to verify transactions, along with the huge expenditure of resources required to solve the problem—which is intentionally and arbitrarily difficult—is why cryptocurrency is so environmentally destructive. Computation requires energy, and millions of competing processors dedicated to solving an intentionally difficult problem add up to a lot of energy expenditure.
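The core of “proof of work” can be sketched in a few lines of Python: keep trying nonces until a hash meets an arbitrary difficulty target. This toy version is illustrative only; actual Bitcoin mining hashes a binary block header with double SHA-256 at vastly higher difficulty.

```python
import hashlib
from itertools import count

def mine(block_data: str, difficulty: int):
    """Try successive nonces until the hash starts with `difficulty` zero hex digits."""
    target = "0" * difficulty
    for nonce in count():
        digest = hashlib.sha256(f"{block_data}{nonce}".encode()).hexdigest()
        if digest.startswith(target):
            return nonce, digest

# Finding the nonce takes brute force; verifying it takes a single hash.
nonce, digest = mine("block of transactions", difficulty=4)
print(nonce, digest)
```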

To combat the wasted energy problem, “proof of stake” is a newer alternative to “proof of work” for block verification. Proof of stake effectively lets people bet their assets on their verification. Someone would need to control 51% of a cryptocurrency’s stake in order to defraud the ledger. For a cryptocurrency like Algorand, which uses proof of stake, the random and secret selection of block certifiers makes it especially difficult to corrupt enough users to defraud the ledger. Proof of stake routes around brute-force computation and thus some of the environmental destruction of blockchain. Rather than favoring the biggest processors, it favors the biggest accounts.
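The intuition behind stake-weighted selection can likewise be sketched simply. The stakes and names below are invented, and real protocols such as Algorand derive their randomness from verifiable cryptographic sortition rather than a plain seeded random draw.

```python
import random

# Hypothetical stakes in some cryptocurrency (illustrative numbers only).
stakes = {"alice": 600, "bob": 300, "carol": 100}

def pick_validator(stakes: dict, seed: int) -> str:
    """Select a block certifier with probability proportional to stake."""
    rng = random.Random(seed)  # a real protocol derives randomness verifiably
    names, weights = zip(*stakes.items())
    return rng.choices(names, weights=weights, k=1)[0]

# alice, with the largest stake, is the most likely (but not guaranteed) pick.
print(pick_validator(stakes, seed=42))
```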

The auras of assets other than money

Any currency is just an abstract representation of value, and so after the Bitcoin white paper, it didn’t take long to figure out how to put assets other than currency on the blockchain. Through smart contracts, the Ethereum protocol enabled property and digital art to record value on the blockchain. A smart contract is a block of code that can automatically execute the terms of a contract. For instance, a smart contract can enable cryptocurrency to be exchanged in response to a triggering event like a digital file transfer and then record the transaction on a blockchain. To integrate non-digital information as an event—say, a death in the case of a will—a smart contract relies on a trusted “oracle” such as a newspaper record (or an oracle network such as Chainlink) to convert that information and trigger a distribution of assets. Many cryptocurrency protocols now include code execution, along with scripting languages and other infrastructure, to enable such smart contracts. Ethereum is the most popular of these protocols for NFTs. 
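A toy sketch can illustrate the control flow of such a contract: funds sit in escrow and are released only when a trusted oracle reports the triggering event. This is a minimal Python sketch of the idea, not Ethereum’s actual Solidity contract model; the class and field names are hypothetical.

```python
class EscrowContract:
    """Toy sketch: release payment when an oracle confirms a file transfer."""

    def __init__(self, buyer: str, seller: str, price: int):
        self.buyer, self.seller, self.price = buyer, seller, price
        self.funded = False
        self.settled = False

    def deposit(self, amount: int) -> None:
        # The buyer locks funds in the contract.
        if amount == self.price:
            self.funded = True

    def report_transfer(self, oracle_confirmed: bool) -> str:
        # A trusted oracle reports the off-chain event (the file transfer).
        if self.funded and oracle_confirmed and not self.settled:
            self.settled = True
            return f"release {self.price} ETH to {self.seller}"
        return "no action"

contract = EscrowContract(buyer="collector", seller="artist", price=10)
contract.deposit(10)
print(contract.report_transfer(oracle_confirmed=True))  # funds released once
```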

Decentraland is a more recent take on a virtual world like Second Life, with real property rights in virtual spaces, but instead of centralized control such as in Second Life, Decentraland is controlled by users and the technology of the DAO (Decentralized Autonomous Organization) made possible on the blockchain. Decentraland ties virtual land purchases to NFTs recorded on the Ethereum blockchain.

 

Cryptokitties
Figure 5: Screenshot demonstrating that Cryptokitties have varying exchange values in ETH, the Ethereum cryptocurrency, https://www.cryptokitties.co/catalogue

In 2017, the site Cryptokitties launched virtual collectible kitties that were registered as NFTs on the Ethereum blockchain. Cryptokitties emulate the artificial scarcity of baseball cards and Pokémon, but use the NFT protocol instead of physical cards to ensure that scarcity. Cryptokitties have a unique slate of traits (“Cattributes”), are released in generations, and can be bred to make more kitties. Cryptokitties are cute and silly, but they are actual assets on the blockchain and have actual (though widely varying) value in Ether (Ethereum’s cryptocurrency). The developers of Cryptokitties set out with a goal of introducing new users to cryptocurrency. Cryptokitties are, then, cute missionaries for a new financial order.

Cryptokitties have as much value as any collectible item like Beanie Babies and Pokémon cards. The developers, who refer to Cryptokitties as a kind of game, write,

Users spend 10-100x more on NFT assets than typical “in-game” digital assets because NFTs guarantee authenticity, scarcity, durability, and true ownership, which means NFTs have something very few digital components currently have: tradable value outside the ecosystem in which they were created. As an owner, I can sell my NFT assets at any given time – or I can keep them forever, passing them on like a family heirloom, from one generation to the next.

Gen Z+1 is certainly looking forward to receiving their cedar chest/Ethereum wallets of Diamond-gene Kitties.

NBA’s Top Shot is directly emulating the marketplace of sports cards through NFTs of “Moments” collected into Packs. Top Shot Moments give you the opportunity to “own your fandom.” The moments are “limited edition guaranteed by the blockchain” and are classified as Common, Rare and Legendary. In March 2021, Packs vary from $9 (Common) to $999 (Legendary). Two million Moments have been sold in the Marketplace, and the Legendary Steph Curry Dec 15, 2019 jumpshot, NY v. Cali, #1/4000 edition, is currently priced at $3,333. Anyone could find a clip of this jump shot on YouTube, and so the NBA has designed the Top Shot Moments to be more sophisticated than clips—they’re slick virtual cubes displaying stats along with video. But what distinguishes Top Shots from video clips is the artificial scarcity of exclusive ownership "guaranteed by the blockchain."

TopShot Moments
Figure 6: Screenshot from NBA TopShot, showing the TopShot Moments virtual cubes, https://nbatopshot.com/about

Cryptokitties and Top Shot Moments are infusing value into bits by making them scarce and capitalizing on the human propensity to hoard scarce goods. Artist Kevin Abosch calls NFTs “layers of hexadecimal code, alphanumerical proxies to distill emotional value” (Schachter). The blockchain platform has made waves in its creation of speculative capital, but with NFTs it creates emotional value as well. The blockchain registers auras.

Help someone stole my Internet tulips

The current bubble of speculation on blockchain-based assets is part of the excitement. What NFTs are worth is determined by what others think they are worth—which is the same as in the physical art market, too. And the same goes for countless other speculative bubbles, such as the Dutch tulip bubble in 1636 and the South Sea bubble in 1720, which caused Pope, Swift, and many others to lose money and prompted Daniel Defoe to call finance “Air-Money.” As Gayle Rogers details in a new book, Speculation, the risks and excitement of finance and investment have a long history. With wild speculation comes fraud and theft. As in the 2008 housing bubble, if the value of an asset keeps increasing, the incentive to verify it decreases. Buy in, cash out before the bubble bursts.

The information an NFT stores on the blockchain is effectively a link to the artwork and its metadata about the artist and provenance. The blockchain information might be immutable, but what it points to can be unstable, as Jonty Wareing recently pointed out. Artists such as Beeple and Grimes use IPFS (the InterPlanetary File System) to host this information. More secure than a URL, which generally relies on one host, IPFS allows multiple hosts for content. But with hosting distributed, no particular host has a responsibility to keep the files online (Kastrenakes). And when files go offline, there is no verification for the NFT. CheckMyNFT, a site spun up just recently in response to the NFT craze, checks whether NFTs are hosted and verified. The site has found that even high-priced NFTs by high-profile artists go offline regularly. And if the blockchain register points to an empty address, an NFT is merely an Air-Asset. Ownership of the NFT then boils down to a digital paper trail of provenance, effectively ending at: “Trust me, I own these bits.”
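What an NFT actually records can be pictured as two small documents: an on-chain entry holding little more than a token ID, an owner address, and a URI, and an off-chain metadata file that the URI resolves to. The sketch below uses field names in the spirit of the widely adopted ERC-721 metadata convention, but every value is invented for illustration.

```python
# Hypothetical example of the kind of metadata an NFT's token URI resolves to.
# If the hosts behind these ipfs:// links disappear, the NFT points at nothing.
nft_metadata = {
    "name": "Example Artwork #1",
    "description": "A digital work sold as an NFT (illustrative values only).",
    "image": "ipfs://QmExampleHashOfTheImageFile",       # content-addressed link
    "attributes": [{"trait_type": "edition", "value": "1 of 1"}],
}

on_chain_record = {
    "token_id": 1,
    "owner": "0xABC...",                                   # buyer's wallet address
    "token_uri": "ipfs://QmExampleHashOfTheMetadataFile",  # points to nft_metadata
}
```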

An artist can theoretically "touch" multiple versions of the same piece of art—that is, mint multiple NFTs for identical pieces. That's because NFTs do not confer copyrights and there's no such thing as an "original" digital file. Since there's nothing stopping an artist from minting multiple NFTs for the same item, again, it’s trust that makes an NFT valuable. Also, without a central certification system for artists and agents, there's not much to prevent someone from fraudulently claiming to represent an artist and issuing NFTs. So, an NFT's value is contingent on whether the community of potential buyers trusts the artist, agent, and the art's authentic connection to both.

Outright theft of NFTs is also perhaps a greater risk than it is for physical assets. You just need someone’s password to their digital wallet. And as the digital security community often says, the weakest link in the chain is always the human. Passwords can be easy to crack or inadvertently shared. The security of any digital wallet depends on the security of its hosting service as well. In its auction of Everydays, Christie’s wisely specified which wallet hosting services it would accept.

 

Special
Figure 7: Screenshot of Christie's note specifying that the cryptocurrency Ether would be accepted for payment, but only as held by certain digital wallet hosts. https://onlineonly.christies.com/s/beeple-first-5000-days/beeple-b-1981-1/112924

As Michael Miraflor discovered when his NFTs were stolen on Nifty Gateway, the “trustless” virtue of the blockchain means human coordination is less necessary, but it also means that coordination and trust are unavailable when the system encounters problems like theft. Whoever has the NFTs owns the NFTs, regardless of how they got them. When his NFTs were stolen, Miraflor was able to get charges reversed on his credit card; however, he could not recover the stolen NFTs because transactions on the blockchain cannot be reversed by design. So the NFTs now belong to someone else, and the system is working as intended (@phantsy_prantz). Many commenters on Miraflor’s Twitter thread were unsympathetic about the theft of virtual assets. @HeadlightsGoUp alluded to Miraflor’s participation in an NFT bubble: “Help someone stole my Internet tulips.”

And just as an owner of a physical artwork can burn it or throw it away, so can owners of NFTs. 0x_b1, “a pseudo anonymous crypto whale” with over $500 million in crypto assets, bought a piece of NFT art by Lindsay Lohan for $43K in Ether and asked a community what to do with the asset. They voted to burn it. So, 0x_b1 transferred the NFT to a burn address: 0x00000. And poof, it’s gone. “NFT-dom is not all bad!” artist Kenny Schachter declared. It is nice to watch celebrities get their just deserts, although Lohan still profited from the sale.

And what about the most valuable NFT art of all, Everydays? If you would like to download your very own identical copy of it, you can here on IPFS Gateway, using the hash of the file. The hash is a mathematical output of an algorithm run on a digital file. Different digital files, even if they’re visually similar, will produce different hashes. Thus, a hash is a way of verifying a file or an exact copy of a file. If you download that exact copy of Everydays, you’ll even have the right hash. You won’t own the copyright to it, but neither does Metakovan, who paid $69 million for it. Beeple, just like any other artist selling their work, retains copyright of a work upon its sale unless otherwise stated.
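Computing and checking such a hash is a short exercise in most languages; here is a minimal Python sketch, with the file name and expected hash left as placeholders rather than the actual values published for Everydays.

```python
import hashlib

def sha256_of_file(path: str) -> str:
    """Compute the SHA-256 hash of a file, reading it in 1 MB chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

# Usage (hypothetical file name and placeholder hash, not Everydays' real values):
# assert sha256_of_file("everydays.jpg") == "hash-published-by-the-seller"
```

Even a one-pixel change to the image would produce a completely different digest, which is why the hash, rather than the visual appearance, is what ties a download to the file referenced by the NFT.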

So, what Metakovan purchased for nearly $69 million isn’t the art or the copyright, but a place on an immutable, public transaction record—and a lot of media coverage. As Kal Raustiala and Christopher Jon Sprigman point out, Everydays is a “virtual Veblen good,” a kind of good one purchases because it is expensive. Thorstein Veblen described the phenomenon of “conspicuous consumption,” especially among the newly wealthy. The value of the good is the social status it confers. Metakovan, the purchaser of Everydays, was behind a Beeple museum in the virtual world Cryptovoxels. As art critic Ben Davis describes, the museum isn’t a particularly good place to show Beeple’s artwork—it is instead a platform for buying B.20 tokens, or partial shares in the Beeple collection. Beeple’s NFT artwork purchased by crypto-speculators serves, then, as a portfolio for further crypto speculation. We can all get in on the virtual ground floor.

NFTs rely on a chain of social valuation tied to digital scarcity, and that scarcity contributes to value. But the supposed scarcity of NFTs relies on a chain of humans, massive computational expenditure, file hosts, trust in digital registers, and speculation on social value. Which means their value is dependent on people, art, finance, and digital security all at once. No wonder many call NFTs a “house of cards.”

NFTs
Figure 8: “NFTs are dumb. Please go outside, do drugs, & have sex like normal” Wilmington, DE, April 2021. Photo by Karl Stomberg. Art by @vivideman https://twitter.com/KFosterStomberg/status/1388209346648084483/photo/1. Used with permission. 

NFTs as a novel nexus of art and value

Like Bitcoin, NFTs rose to prominence in a crisis, and each has been both a reflection of and a response to the specific nature of its crisis. As during the banking crisis of 2008, the ground during the Covid-19 pandemic of 2020-21 was primed for the radical rethinking of value that RZA and Cilvaringz asked for. We have all been jolted out of our routines, physical trading and transport of goods is limited, cryptocurrency has gained a foothold in mainstream finance, and we are bolted to our computers with only virtual materials as a means of creation. Out of this context, NFTs evolved as a novel nexus of art and value.

But while this particular nexus might be new, the attempt to create value and scarcity on new technological platforms is not. Manufacturing, film, and photography prompted Benjamin to consider how artificial scarcity contributed to the aura of art. That we now instill emotional value in digital art, blockchain registers, an InterPlanetary File System, and non-fungible tokens is another reminder that technology echoes humanity. We have created scarce, rivalrous, positional goods even when replication and scale are key affordances of computation.

Acknowledgments: Thank you to Alexandria Lockett for inspiring this piece by pointing me to the Beeple sale and asking my thoughts. I am also grateful to Tim Laquintano, Alison Langmead, and Gayle Rogers for feedback on the draft in process and to Steve Vee, @vivideman, and Karl Stomberg for letting me feature their art. Thanks also to Jeffrey Yost and Amanda Wick for launching an open access platform for this kind of work, as well as helpful comments on the draft.


Bibliography

@beeple. “holy fuck.” Twitter, 11 Mar. 2021, 10:42 a.m., https://twitter.com/beeple/status/1370037462085595137.

Benjamin, Walter. (1968). “The Work of Art in the Age of Mechanical Reproduction.” Illuminations, edited by Hannah Arendt, Fontana, pp. 214–18.

Boucher, Brian. (2021). “Twitter Founder Jack Dorsey Is Auctioning Off the World’s First Ever Tweet as an NFT—and the High Bid Is Already $2.5 Million.” Artnet, March 2021. https://news.artnet.com/market/jack-dorsey-nft-tweet-1950279.

Brunton, Finn. (2019). Digital Cash: The Unknown History of the Anarchists, Utopians, and Technologists Who Created Cryptocurrency. Princeton University Press.

“Check My NFT.” Check My NFT, https://checkmynft.com/. Accessed 9 Apr. 2021.

“Cryptokitties.” Cryptokitties, https://www.cryptokitties.co/. Accessed 9 Apr. 2021.

Davis, Ben. (2021). “I Visited the Digital Beeple Art Museum and All I Got Was an Aggressive Pitch for My Money.” Artnet, March 25, 2021. https://news.artnet.com/opinion/beeple-b-20-museum-review-1954174.

“Decentraland.” Decentraland, https://decentraland.org/. Accessed 9 Apr. 2021.

Fleming, Nic. (2009). “Virtual World Theft Heads to Real Life Court.” Computer Weekly, https://www.computerweekly.com/news/1280090966/Virtual-world-theft-heads-to-real-life-court.

Graham, Robert. (March 20, 2021). “Deconstructing that $69million NFT.” Security Boulevard, https://securityboulevard.com/2021/03/deconstructing-that-69million-nft/.

Grow, Kory. (2018). “RZA Wanted to Buy Martin Shkreli’s Wu-Tang Album Back for Himself.” Rolling Stone, https://www.rollingstone.com/music/features/rza-talks-martin-shkreli-shaolin-wu-tang-album-w518574.

@HeadlightsGoUp. “Help someone stole my internet tulips.” Twitter, 15 March 2021, 11:05 a.m., https://twitter.com/HeadlightsGoUp/status/1371477584781979650.

@jonty. “Out of curiosity I dug into how NFT's actually reference the media you're "buying" and my eyebrows are now orbiting the moon.” Twitter, 17 March 2021, 8:30 a.m. https://twitter.com/jonty/status/1372163423446917122

@kennyschac. “Someone bought #lindsaylohan nft and destroyed it. C’mon you must admit NFTdom is not all bad!” Twitter, 11 Mar. 2021, 8:40 a.m. https://twitter.com/kennyschac/status/1370006659096064007.

Kastrenakes, Jacob. (2021). “Your Million Dollar NFT Can Break Tomorrow If You’re Not Careful.” The Verge, March 2021. https://www.theverge.com/2021/3/25/22349242/nft-metadata-explained-art-crypto-urls-links-ipfs.

Langmead, Alison, et al. (2021). “Leonardo, Morelli, and the Computational Mirror.” Digital Humanities Quarterly, vol. 15, no. 1, http://www.digitalhumanities.org/dhq/vol/15/1/000540/000540.html.

Laquintano, Timothy. (2016). Mass Authorship and the Rise of Self-Publishing. University of Iowa Press.

@michaelmiraflor. “Someone stole my NFTs today on @niftygateway and purchased $10K++ worth of today's drop without my knowledge. NFTs were then transferred to another account. I encourage EVERYONE to please check their accounts ASAP. Could use everyone's help here please RT!” Twitter, 14 March 2021, 4:39 p.m., https://twitter.com/michaelmiraflor/status/1371199359996456960.    

“NBA Top Shot.” NBA Top Shot, https://nbatopshot.com/. Accessed 9 Apr. 2021.

“Online Auction 20447.” (March 11, 2021). Christie’s, https://onlineonly.christies.com/s/beeple-first-5000-days/beeple-b-1981-1/112924.

@phantsy_prantz. “it's literally impossible by design you clearly don't understand the technology very well by the terms of ‘ownership’ of an NFT, the new owner is the only owner of those tokens, regardless of anyone's feelings this is the system working as intended.” 17 Mar., 2021, 1:17 p.m., https://twitter.com/phantsy_prantz/status/1371510963405529090.

Rogers, Gayle (2021). Speculation: A Cultural History from Aristotle to AI. Columbia University Press.

Raustiala, Kal and Christopher Jon Sprigman (2021). “The One Redeeming Quality of NFTs Might Not Even Exist.” Slate, April 14, 2021. https://slate.com/technology/2021/04/nfts-digital-art-authenticity-problem.html.

Schachter, Kenny. (2021). “Are NFTs the Next Tulip Bubble? Kenny Schachter Doesn’t Care and He Sold His Own Grandma on the Crypto Web to Prove It.” Artnet, Mar. 2021. https://news.artnet.com/opinion/kenny-schachter-on-nfts-continued-1950407.

“Second Life.” Second Life, https://secondlife.com/. Accessed 9 Apr. 2021.

 

Vee, Annette. (June 2021). “NFTs, Digital Scarcity, and the Computational Aura.” Interfaces: Essays and Reviews in Computing and Culture Vol. 2, Charles Babbage Institute, University of Minnesota, 38-54.


About the author:

Annette Vee is Associate Professor of English and Director of the Composition Program at the University of Pittsburgh, where she teaches undergraduate and graduate courses in writing, digital composition, materiality, and literacy. Her teaching, research, and service all dwell at the intersections between computation and writing. She is the author of Coding Literacy (MIT Press, 2017), which demonstrates how the theoretical tools of literacy can help us understand computer programming in its historical, social, and conceptual contexts.


 

Everyday Information Studies: The Case of Deciding Where to Live

Melissa G. Ocepek and William Aspray

Abstract: This essay introduces everyday information studies to historians of computing. This topic falls within the subdiscipline of information behavior, one of the main subject areas in information studies. We use our recent edited book, Deciding Where to Live (Rowman & Littlefield, 2021), as a means to illustrate the kinds of topics addressed and methods used in everyday information studies. We also point the reader to some other leading examples of scholarship in this field and to two books that present an overview of the study of information behavior.

(PDF version available for download.)

This essay introduces everyday information studies to historians of computing. The story of this field of study and its history is too large to tell in detail here. This topic falls within the subdiscipline of information behavior, one of the main subject areas in information studies – a field that began to be studied between the two world wars and took off in the 1960s. The reader interested in information behavior studies more generally should examine two well-regarded reference works on this subject (Case and Given 2016; Fisher, Erdelez, and McKechnie 2005).

Information Study Approaches

The early research on information behavior focused on human behavior in structured information environments, such as when a person went to a library to seek information or interacted with a database. But, of course, there were other, less structured environments for finding information, such as through conversations with friends and family; consulting religious or civic leaders, or area specialists such as financial advisors; and through consumption of the media. With the coming of the Internet and portable information devices, one could seek information anywhere, anytime, on any subject profound or frivolous. Information seeking, consumption, and analysis became an increasingly everyday part of ordinary people’s lives. The field expanded over time to include not only information needs, wants, and seeking, but also information avoidance and overload, and various kinds of affective as well as cognitive responses to information.

In fact, the everyday aspects of information were studied not only by information scholars but also by sociologists, communications scholars, and media scholars beginning as early as the 1930s. These studies about the roles information plays in one’s everyday life draw upon theorizing by such scholars as Michel de Certeau (1984), Henri Lefebvre (2008/1947), Dorothy E. Smith (1987), and Carolyn Steedman (1987). For an overview of the relevant theorizing, see Highmore (2001), Bakardjieva (2005, Chs. 1 and 2), and Haythornthwaite and Wellman (2002). Highmore also includes writing selections from many of these theorists. To make this introduction to everyday information studies more manageable, we focus here on our own work and primarily on our recent edited book, Deciding Where to Live (Ocepek and Aspray 2021). For a sample of other everyday information studies, see for example the work of Denise Agosto (with Sandra Hughes-Hassell, 2005), Karen Fisher (neé Pettigrew, 1999), Tim Gorichanaz (2020), Jenna Hartel (2003), Pam McKenzie (2003), and Reijo Savolainen (2008).

Our personal involvement with research on everyday information studies began with Everyday Information (Aspray and Hayes 2011), which injected historical scholarship into studies on everyday information. In a long study of “100 Years of Car Buying”, one of us (Aspray, pp. 9 – 70 in Aspray and Hayes 2011) introduced a historical model, showing how endogenous forces (e.g., the dealership model for selling automobiles, or the introduction of foreign automobiles into the American market) and exogenous forces (e.g., war, or women entering the workforce) shaped the information questions that people were interested in and sometimes even the information sources they consulted. This volume, presenting an historical approach to everyday information behavior, included contributions by the noted historians of computing James Cortada, Nathan Ensmenger, and Jeffrey Yost.

Our collaboration began when the two of us, together with our colleague George Royer (today a game designer in Austin, TX), wrote two books about food from the perspective of information studies. We did not follow the typical approaches of food scholars, studying such topics as food pathways or food security, but instead applied the lens of information studies to this topic of wide popular interest. In the two short books that we produced, Food in the Internet Age (Aspray, Royer, Ocepek 2013) and Formal and Informal Approaches to Food Policy (Aspray, Royer, and Ocepek 2014), we discussed a wide variety of topics, such as: the online grocer Webvan (the largest loser of venture capital in the dot-com crash of 2001); the harms that Yelp, OpenTable, and Groupon created for small brick-and-mortar businesses and customers; the different ways in which the Internet has been used to represent and comment upon food and food organizations; the regulation of advertising of sweetened cereals to children; and the strategies of informal, bully pulpit persuasion compared to formal regulation of food and nutrition – carried out through a pair of studies: one of Eleanor and Franklin Roosevelt, and the other of Michelle and Barack Obama.

This work on food, and some of our subsequent research, falls into the field of information studies. We carefully use that term instead of information science because our work is more informed by humanities (critical theory, cultural studies) and social science disciplines (sociology, psychology, organizational and management studies, work and labor studies) than by computer science, natural science, and engineering disciplines. We both have worked in information schools, part of a movement toward the interdisciplinary study of computing and information that has emerged in the past quarter century out of (1) library schools becoming more technical, (2) computer science departments becoming more interested in people and human institutions and their social impact, and (3) newly created interdisciplinary enterprises. These information schools offer a big tent for many different kinds of methods, theories, and approaches. The breadth of these studies can be seen in the wide range of papers delivered at the annual meeting of ASIST (for example, https://www.conftool.org/asist2020/index.php?page=browseSessions&path=adminSessions) and the annual "iConference" (https://ischools.org/Program). Also see the types of scholarship presented at the specialty biennial conference on "Information Seeking in Context" (ISIC, e.g., http://www.isic2018.com).

So far, there is little cross-usage of methods or approaches among people studying everyday information – for example, a traditional information studies scholar who studies information literacy incorporating research from data science or ubiquitous computing – but this cross-fertilization is just beginning to happen. In our own research, we do the next best thing through edited volumes whose chapters use a variety of approaches, so as to gain multiple perspectives on an issue. This is true, for example, in our book on where to live (discussed in detail below) and the book on information issues in aging (mentioned below).

Figure 1: The cover of Deciding Where to Live: Information Studies on Where to Live in America, edited by Melissa G. Ocepek and William Aspray.


Deciding Where to Live

In our recent edited book, Deciding Where to Live, we are continuing our study of everyday phenomena through an information lens. We describe this book in some detail here to give our readers a better sense of the ways in which information studies scholars operate. All of the chapters in this book were written by people associated with leading information schools in the United States (Colorado, Illinois, Indiana, Syracuse, Texas, Washington). As with our food studies, we have taken multiple perspectives – all drawn from information studies – to investigate various aspects of housing. These studies, for example, employ work studies and business history; information, culture, and affective aspects of information issues; community studies; information behavior; and privacy.

Information scholars are often interested in the results of scholarship by labor, management, and organization scholars, and sometimes they adopt their theories and methods. These scholars are interested in such issues as the growing number of information occupations, the increased share of a person’s job tasks devoted to information activities, and the ways in which tools of communication and information have changed firm strategies and industry structures. Everyday information scholars, too, are interested in these results, but primarily for what they have to say about the everyday or work lives of individuals.

The work of real estate firms, realtors, and home buyers and sellers has been profoundly changed by the massive adoption of information and communication technologies in recent years. Let us consider two chapters from the Deciding Where to Live book, by James Cortada and Steve Sawyer. One major change in the 21st century has been the rise of websites, such as Zillow and Realtor.com, that enable individuals to access detailed information about housing without having to rely upon a realtor or the Multiple Listing Service. Using a business history approach, Cortada shows how these developments have changed the structure of the real estate industry, altered the behavior of individual firms, made buyers and sellers more informed shoppers, lowered commissions on house sales, and introduced new business models, such as Zillow buying homes itself rather than just providing information about them. Some people believe that the rise of companies such as Zillow means that the imbalance between the information held by realtors and by buyers is largely a thing of the past, that intermediation by realtors is also largely over, and that the need for realtors is greatly diminished – and that we will see a radical shrinking of this occupation in the same way that the numbers of telephone operators and travel agents have plummeted. (See Yost 2008.)

Sawyer argues, however, that the work of the real estate agent is evolving rather than being eliminated. As he states his argument: “real estate agents have been able to maintain, if not further secure, their role as market intermediaries because they have shifted their attention from being information custodians to being information brokers: from providing access to explaining.” (Sawyer 2021, p. 35) As he notes, the buying of a house is a complex process, involving many different steps and many different participants (selecting the neighborhood and the particular house, inspecting the property, checking on title and transferring it, obtaining financing, remediating physical deficiencies in the property, etc.). One might say that it takes a village to sell a house in that village, and an important role of the real estate agent is to inform the buyers of the many steps in the process and to use their network of specialists to help the buyers carry out each step in a professional, timely, and cost-effective way.

Figure 2: A Zillow search page for Arvada, CO, captured on April 4, 2021.

How do these changes affect the everyday life of the individual? There are more than 2 million active real estate agents in the United States. Their work has changed profoundly as they adopt real-estate-oriented websites and apps. Even though most real estate agents work through local real estate firms, they act largely as independent small businesspeople who carry out much of their work from their cars and homes as much as from their offices. So they rely on websites and apps not only for information about individual homes, but also for lead generation, comparative market analysis, customer relationship management, tracking business expenses such as mileage, access to virtual keys, video editing of listings, mounting marketing campaigns, and a multitude of other business functions. Buyers and sellers, for their part, can use Zillow or its online competitors to become informed shoppers before ever meeting with a real estate agent: learning how much their current home is worth, figuring out how large a mortgage they can qualify for, checking out multiple potential neighborhoods not only for housing prices but also for quality of schools and crime rates, checking out photos and details of numerous candidate houses, and estimating the total cost of home ownership. Interestingly, many individuals who are not looking to buy or sell a home in the near term are regular users of Zillow. It is a way to spy on neighbors, try out possible selves, plan for one’s future, or just have a good time. In our introductory chapter, we address these issues.

Another chapter, by Philip Doty, reflects upon the American dream of the smart home. Drawing upon scholarship on surveillance capitalism (Shoshana Zuboff 2019), feminist scholarship on privacy (Anita Allen 1988; Patricia Boling 1996; Catharine MacKinnon 1987), gender studies in the history of science and technology (Ruth Cowan 1983), the geography of surveillance (Lisa Makinen 2016), and other scholarly approaches, Doty reflects on the rhetorical claims of technological enthusiasm surrounding smart cities and smart homes, and discusses some of the privacy and, in particular, surveillance issues that arise in smart homes.

Information is not merely used by people in cognitive ways; it can also bring joy, sadness, anxiety, and an array of other emotions. Deciding where to live can be an exciting, fraught, and stressful experience for many people. When one is searching for a home in a particularly competitive housing market, the addition of time pressure can amp up the emotional toll of house hunting and discourage even the most excited home buyer. In her chapter, Carol Landry recounts how the high-stakes decision making of home buying becomes even more complicated when time pressure and emotions come into play. Her research is based on an empirical study of home buyers in the highly competitive Seattle real estate market. The chapter describes the experience of several home buyers dealing with bidding wars that required quick decision making and many failed attempts at securing a home. The stories shared in this chapter highlight the despair and heartbreak that made continuing the home search difficult; participants described going from enthusiastic information seekers to worn-out information avoiders. The chapter highlights how internal and external factors can affect the home buying process and the information behaviors associated with it.

A competitive real estate market is but one of myriad experiences that can further complicate the process of deciding where to live. There are times in most people’s lives when the unique attributes of a life stage play an outsized role in decision making around housing; one of these times is retirement. In Aspray’s chapter, the realities of retirement confront individuals lucky enough to be able to retire with new considerations that shape decision making. Retirement adds new complexity to deciding where to live because the stability of work that structures many people’s lives is no longer there, creating new opportunities as well as new constraints. Different elements shape questions about where to live for retired people, including emotional ties to their current homes, the financial realities of retirement income, and the physical limitations of aging.

Figure 3: The cover of HGTV Magazine from January/February 2016.

During times of societal uncertainty, a home can be a comforting shelter that keeps the external world at bay. This is true even when much of the uncertainty stems from the housing market itself, as it did during the housing crisis of 2007 and the recession that followed. As more and more people lost their homes to foreclosure or struggled to pay their mortgages, home and garden entertainment media provided a pleasant, comfortable escape for millions of Americans. Ocepek, in her chapter on home and garden sources, found that, throughout the housing crisis, recession, and recovery, home and garden sources grew or maintained their popularity with viewers and readers – likely due to the social phenomenon of cocooning, or taking shelter in one’s space when the world outside becomes uncertain and scary. Both home and garden magazines and HGTV made changes to their content to reflect the new home realities of many of their readers and viewers, but they also largely stayed the same, presenting comforting content about making whatever space you call home as comfortable as possible.

The financial hardships of the housing crisis, recession, and recovery were not experienced by all Americans in equal measure. Several authors in the book present examples of housing policies, economic conditions, and social unrest disproportionately affecting marginalized communities throughout the United States. One is Pintar’s chapter about Milwaukee, mentioned below. Although some of the legal frameworks built to segregate cities and communities throughout the country have changed, the experience of deciding where to live for Black and African Americans adds additional layers of complexity to the already complicated process. Drawing on critical race theory, Jamillah Gabriel delineates how Black and African American house searchers (renters and buyers) create information seeking and search strategies to overcome the historic and contemporary discriminatory policies and practices of housing segregation. The chapter analyzes specialized information sources that provide useful information to help this group of house searchers find safer communities where they have the greatest chance to prosper. These sources include lists of the best and worst places for African American and Black individuals and families to live. The lists draw on research that compares communities based on schools, employment, entertainment, cost of living, housing market, quality of life, and diversity. Drawing on historic and contemporary accounts, the analysis provided in this chapter highlights that, “the housing industry can be a field of land mines for African American in search of home” (Gabriel 2021, p. 274).

Figure 4: Home Owner’s Loan Corporation Map of Milwaukee, Wisconsin, 1938, National Archives; image retrieved from Mapping Inequality, University of Richmond, https://dsl.richmond.edu/panorama/redlining/#loc=11/43.03/-88.196&city=milwaukee-co.-wi

It is often said that information and information tools are neither inherently good nor bad, but that they can be used for both good and bad purposes. Two chapters in the book illustrate this point. In a study of the city of Milwaukee, Judith Pintar shows how HOLC maps, which were created to assess the stability of neighborhoods, were used to reinforce the racist practice of redlining. In another chapter, Hannah Weber, Vaughan Nagy, Janghee Cho, and William Aspray show how information tools were used by the city of Arvada, Colorado and various groups (such as builders, realtors, parents, activists, and the town council) to improve the city’s quality of life in the face of rapid growth and its attendant issues, such as traffic problems, rising housing prices, the need to build on polluted land, and the desire to protect the traditional look and feel of this small town. A third chapter, by David Hopping, shows how an experiment in Illinois was able to repurpose military housing for non-military purposes for the social good. His empirical study is seen through the lens of the theoretical constructs of heterotopia (Foucault 1970), boundary objects (Star and Griesemer 1989), and pattern language (Alexander 1977).

Conclusions

Both of us are continuing to pursue work on everyday information issues. One (Aspray) is continuing this work on information studies in everyday life through an edited book currently in progress on information issues related to older Americans (Aspray, forthcoming in 2022). This book ranges from traditional library and information science approaches (health information literacy, insurance for older Americans, the variety of information provided by AARP and its competitors, and the use of information and communication technologies to improve life in elderly communities) to more technologically oriented studies of ubiquitous computing, human-computer interaction, and the Internet of Things for older people. Meanwhile, Ocepek is building on her doctoral dissertation (Ocepek 2016), which examined the everyday activity of grocery shopping from both social science and cultural approaches. Her new study examines what has happened to grocery shopping during the pandemic.

We are pleased to see the broadening of the Babbage Institute’s mission to consider not only the history of computing but also the history and cultural study of information. For example, many scholars (including some computer historians) have been studying misinformation since 2016. (See, for example, Cortada and Aspray 2019; Aspray and Cortada 2019.) The study of everyday information is another way in which the Babbage Institute can carry out its broadened mission today.

In particular, there are a few lessons for computer historians that can be drawn from the scholarship we have discussed here, although many readers of this journal may already be familiar with and practicing them:

  • One can study information as well as information technology. On the history of information, see for example Blair (2010), Headrick (2000), Cortada (2016), and Ann Blair et al. (2021). For a review of this scholarship, see Aspray (2015).
  • One can study everyday uses of information and information technology, even if they may be regarded by some as quotidian – expensive, complex, socially critical systems are not the only kinds of topics involving information technology that are worth studying.
  • This past year has taught all of us how an exogenous force, the COVID-19 pandemic, can quickly and radically reshape our everyday lives. In the opening chapter of our book, we briefly discuss the earliest changes the pandemic brought to real estate. We are also seeing the grocery industry, as well as millions of consumers, learning, adapting, and changing their information behaviors around safely acquiring food.
  • In order to study both historical and contemporary issues about information and information technology, one can blend historical methods with other methods from computer science (e.g., human-computer interaction, data science), social science (qualitative and quantitative approaches from sociology, psychology, economics, and geography), applied social science (labor studies, management and organization studies), and the humanities disciplines (cultural studies, critical theory).

These are exciting times for the historians of computing and information!


Bibliography

Agosto, Denise E. and Sandra Hughes-Hassell. (2005). "People, places, and Questions: An Investigation of the Everyday Life Information-Seeking Behaviors of Urban Young Adults." Library & Information Science Research, vol. 27, no. 2, pp. 141-163.

Alexander, Christopher. (1977). A Pattern Language. Oxford University Press.

Allen, Anita L. (1988). Uneasy Access: Privacy for Women in a Free Society. Rowman & Littlefield.

Aspray, William. (2015). "The Many Histories of Information." Information & Culture, 50.1: 1-23.

Aspray, William. (forthcoming 2022). Information Issues for Older Americans. Rowman & Littlefield.

Aspray, William and James Cortada. (2019). From Urban Legends to Political Factchecking. Springer.

Aspray, William and Barbara M. Hayes. (2011). Everyday Information. MIT Press.

Aspray, William, George W. Royer, and Melissa G. Ocepek. (2013). Food in the Internet Age. Springer. 

Aspray, William, George W. Royer, and Melissa G. Ocepek. (2014). Formal and Informal Approaches to Food Policy. Springer.

Bakardjieva, Maria. (2005). Internet Society: The Internet in Everyday Life. Sage.

Blair, Ann. (2010). Too Much to Know. Yale.

Blair, Ann, Paul Duguid, Anja Silvia-Goeing, and Anthony Grafton, eds. (2021). Information: A Historical Companion. Princeton.

Boling, Patricia. (1996). Privacy and the Politics of Intimate Life. Cornell University Press.

Case, Donald O. and Lisa M. Given. (2016). Looking for Information. 4th ed. Emerald.

Cortada, James and William Aspray. (2019). Fake News Nation. Rowman & Littlefield.

Cowan, Ruth Schwartz. (1983). More Work for Mother. Basic Books.

De Certeau, Michel (1984). The Practice of Everyday Life. Translated by Steven F. Rendall. University of California Press. 

Fisher, Karen E., Sandra Erdelez, and Lynne McKechnie. (2009). Theories of Information Behavior. Information Today.

Foucault, Michel. (1970). The Order of Things. Routledge.

Gorichanaz, Tim (2020). Information Experience in Theory and Design. Emerald Publishing.

Hartel, Jenna. (2003). "The Serious Leisure Frontier in Library and Information Science: Hobby Domains." Knowledge Organization, vol. 30, No. 3-4, pp. 228-238.

Haythornthwaite, Caroline and Barry Wellman, eds. (2002). The Internet in Everyday Life. Wiley-Blackwell.

Headrick, Daniel. (2000). When Information Came of Age. Oxford.

Highmore, Ben ed. (2001). The Everyday Life Reader. Routledge.

Lefebvre, Henri. (2008). Critique of Everyday Life. vol. 1, 2nd ed. Translated by John Moore. Verso.

MacKinnon, Catharine. (1987). Feminism Unmodified. Harvard University Press.

Makinen, Lisa A. (2016). "Surveillance On/Off: Examining Home Surveillance Systems from the User’s Perspective." Surveillance & Society, 14.

McKenzie, Pamela J. (2003). "A Model of Information Practices in Accounts of Everyday‐Life Information Seeking." Journal of Documentation, vol. 59, no. 1, pp. 19-40.

Ocepek, Melissa G. (2016). "Everyday Shopping: An Exploration of the Information Behaviors of the Grocery Shoppers." Ph.D. Dissertation, School of Information, University of Texas at Austin.

Ocepek, Melissa G. and William Aspray, eds. (2021). Deciding Where to Live. Rowman & Littlefield.

Pettigrew, Karen E. (1999). "Waiting for Chiropody: Contextual Results from an Ethnographic Study of the Information Behaviour Among Attendees at Community Clinics." Information Processing & Management, vol. 35, no. 6, pp. 801-817.

Savolainen, Reijo. (2008). Everyday Information Practices: A Social Phenomenological Perspective. Scarecrow Press. 

Smith, Dorothy E. (1987). The Everyday World as Problematic: A Feminist Sociology. Northeastern University Press. 

Star, Susan Leigh and James R. Griesemer. (1989). "Institutional Ecology, Translations, and Boundary Objects: Amateurs and Professionals in Berkeley’s Museum of Vertebrate Zoology, 1907-39." Social Studies of Science 19, 3: 387-420.

Steedman, Carolyn. (1987). Landscape for a Good Woman: A Story of Two Lives. Rutgers University Press. 

Yost, Jeffrey R. (2008). “Internet Challenges for Nonmedia Industries, Firms, and Workers.” pp. 315-350 in William Aspray and Paul Ceruzzi, eds., The Internet and American Business. MIT Press.

Zuboff, Shoshana. (2019). The Age of Surveillance Capitalism. Public Affairs.

 

Aspray, William and Ocepek, Melissa G. (April 2021). “Everyday Information Studies: The Case of Deciding Where to Live." Interfaces: Essays and Reviews in Computing and Culture Vol. 2, Charles Babbage Institute, University of Minnesota, 27-37.


About the authors:

Melissa G. Ocepek is an Assistant Professor at the University of Illinois Urbana-Champaign in the School of Information Sciences. Her research draws on ethnographic methods and institutional ethnography to explore how individuals use information in their everyday lives. Her research interests include everyday information behavior, critical theory, and food. Recently, she co-edited Deciding Where to Live (Rowman & Littlefield, 2021) with William Aspray. Previously she published two books that address the intersection of food, information, and culture: Food in the Internet Age and Formal and Informal Approaches to Food Policy (both with William Aspray and George Royer). Dr. Ocepek received her Ph.D. at the University of Texas at Austin in the School of Information.

William Aspray is Senior Research Fellow at CBI. He formerly taught in the information schools at Indiana, Texas, and Colorado; and served as a senior administrator at CBI, the IEEE History Center, and Computing Research Association. He is the co-editor with Melissa Ocepek of Deciding Where to Live (Rowman & Littlefield, 2021). Other recent publications include Computing and the National Science Foundation (ACM Books, 2019, with Peter Freeman and W. Richards Adrion); and Fake News Nation and From Urban Legends to Political Fact-Checking (both with James Cortada in 2019, published by Rowman & Littlefield and Springer, respectively). 


 

Of Mice and Mentalité: PARC Ways to Exploring HCI, AI, Augmentation and Symbiosis, and Categorization and Control

Jeffrey R. Yost, Charles Babbage Institute, University of Minnesota

Abstract: This think piece essay comparatively explores history and mindsets in human-computer interaction (HCI) and artificial intelligence (AI)/machine learning (ML). It draws on oral history, archival, and other research to reflect on the institutional, cultural, and intellectual history of HCI (especially the Card, Moran, and Newell team at Xerox PARC) and AI. It posits that the HCI mindset (focused on augmentation and human-machine symbiosis, as well as iterative maintenance) could be a useful framing to rethink dominant design and operational paradigms in AI/ML that commonly spawn, reinforce, and accelerate algorithmic biases and societal inequality.

(PDF version available for download.)

 

First mouse
First Computer Mouse prototype designed and developed by Douglas Engelbart, William English, and their ARC Team at SRI. SRI International, CC BY-SA 3.0 <https://creativecommons.org/licenses/by-sa/3.0>, via Wikimedia Commons

This essay briefly recounts the 1982 professional organizational founding for the field of Human-Computer Interaction (HCI) before reflecting on two decades prior in interactive computing—HCI’s prehistory/early history—and its trajectory since. It comparatively explores history and mindsets with HCI and artificial intelligence (AI). For both HCI and AI, “knowing users” is a common target, but also a point of divergent departure.

For AI—especially large-scale, deployed systems in defense, search, and social networking—knowing users tends to involve surveillance, data collection, and analytics to categorize and control in the service of capital and power. Even when aims are purer, algorithmic biases frequently extend from societal biases. Machines can be programmed to discriminate or can learn it from data and data practices.

For HCI—from idealistic 1960s beginnings through 1980s professionalization and beyond—augmenting users and human-machine symbiosis have been at its core. While an HCI-type mindset offers no magic bullet for AI’s ills, this essay posits that it can be a useful framing, a reminder toward proper maintenance, stewardship, and structuring of data, design, code (software), and codes (legal, policy, and cultural). HCI systems, of course, can be ill designed, perform in unforeseen ways, or be misapplied by users, but this likely is less common and certainly of lesser scale and impact relative to AI. Historians and sociologists must research the vast topics of AI and HCI more fully in many contexts and settings.

HCI and Solidifying the Spirit of Gaithersburg

In mid-March 1982, ITT Programming Technology Center’s Bill Curtis and the University of Maryland’s Ben Shneiderman held the first “Human Factors in Computing Systems” conference in Gaithersburg, Maryland. The inspiring event far exceeded the organizers’ expectations, attracting more than 900 attendees. It was the pivotal leap forward in professionalizing HCI.

Rich content filled the three-day program, while impactful organizational work occurred at an evening small-group side meeting. At the latter, Shneiderman, Curtis, UCSD’s Don Norman, Honeywell’s Susan Dray, Northwestern’s Loraine Borman, Xerox PARC's (Palo Alto Research Center) Stuart Card and Tom Moran, and others strategized about HCI’s future and the possibilities for forming an association within a parent organization. Borman, an information retrieval specialist in a leadership role at ACM SIGSOC (Social and Behavioral Computing), and Shneiderman, a computer scientist, favored the Association for Computing Machinery (ACM). Insightfully seeing an expedient workaround, Borman proposed that SIGSOC transform itself—new name/mission—bypassing the need for approval of a new SIG.

The Design of Everyday Things book cover

 

Cognitive scientist Don Norman questioned whether ACM should be the home, believing computer science (CS) might dominate. After debate, Shneiderman and Borman’s idea prevailed. Dray recalls that the sentiment was “we can’t let the spirit of Gaithersburg die,” and for most, SIGSOC’s metamorphosis seemed a good strategy (Dray 2020). Borman orchestrated transforming SIGSOC into SIGCHI (Computer-Human Interaction). The CHI tail essentially became the dog (SOC’s shrinking base mainly fit under HCI’s umbrella). Interestingly, “Computer” comes first in the acronym, but likely just to achieve a pronounceable word in the ACM SIG style, as “HCI” appeared widely in early papers at CHI (SIGCHI’s annual conference).

Norman’s concerns proved prescient. SIGCHI steadily grew, reaching over 2,000 attendees by the 1990 Seattle CHI, but in its first decade it principally furthered CS research and researchers. Scholarly standards rose, acceptance rates fell, and some practitioners felt crowded out. In 1991, practitioners formed their own society, the Usability Professionals’ Association (now the User Experience Professionals Association, UXPA). In the 1990s and beyond, SIGCHI blossomed into an organization increasingly diverse in its (academic) disciplines.

As with all fields/subfields, HCI has a prehistory or an earlier less organizationally defined history (for HCI, the 1960s and 1970s). SIGCHI’s origin lay in the confluence of: past work in human factors; university “centers of excellence” in interactive computing created through 1960s Advanced Research Projects Agency (ARPA) Information Processing Techniques Office (IPTO) support; two particularly impactful laboratories (PARC and SRI’s ARC); Systems Group artists in the UK; and the promise of Graphical User Interface (GUI) personal computers (PCs).

The nonprofit SRI’s Augmentation Research Center (ARC) and Xerox’s PARC were at the forefront of GUI and computer mouse developments in the 1970s and 1980s. Neither the GUI nor mouse R&D was secret at PARC; in the 1970s, many visitors saw Alto demos, including, in 1979, Steve Jobs and an Apple Computer team. In 1980 Apple hired away PARC’s Larry Tesler and others. Jobs launched the Apple Lisa effort (completed in 1983, priced at $10,000), which, like the even more expensive Xerox Star (1981), possessed a GUI and mouse. The 1984 Apple Macintosh, retailing at $2,500, initiated an early mass market for GUI personal computers—inspiring imitators, most notably Microsoft Windows 2.0 in 1987.

In early 2020, I conducted in-person oral history interviews with three of HCI’s foremost intellectual and organizational pioneers—the pilot for a continuing ACM/CBI project. They were UCSD Professor Don Norman (SIGCHI Lifetime Research Awardee; Benjamin Franklin Medalist), Xerox PARC Scientist and Stanford Professor Stuart Card (SIGCHI Lifetime Research Awardee; National Academy of Engineering), and Dr. Susan Dray (SIGCHI Lifetime Practice Awardee; UXPA Lifetime Achievement Awardee).

Don Norman is well known both within and outside CS—his fame extending from his 1988 book The Psychology of Everyday Things (POET), re-released as the wide-selling The Design of Everyday Things. A student of Duncan Luce (University of Pennsylvania), he was among the first doctorates in mathematical psychology. Early in his career, he joined the UCSD Psychology Department as an associate professor. After stints at Apple and Hewlett-Packard, and at Northwestern, he returned to lead the UCSD Design Laboratory. Norman helped take design from its hallowed ground of aesthetics to establish it in science, and greatly advanced the understanding and practice of usability engineering.

Susan Dray & Norman's POET
HCI Scientist/Entrepreneur Susan Dray (left) and Norman's POET. 

Norman stressed to me that there is one scientist so consistently insightful that he never misses his talks at events he attends: PARC’s Stuart Card. Card was the top doctoral student of Carnegie Mellon Professor of Cognitive Psychology and Computer Science Allen Newell. While these two interviews were in California, my interview with Dr. Susan Dray was in Minneapolis, with the scientist who pioneered the first corporate usability laboratory outside the computer industry (IBM and DEC had them), at American Express Financial Advisors (AEFA).

Dray took a different path after her doctorate in psychology from UCLA: human factors work on classified Honeywell Department of Defense (DoD) projects. In the early 1980s, Honeywell, a pioneering firm in control systems, computers, and defense contracting, had a problem with ill-adapted computing for clerical staff at its headquarters, which Dray evaluated. This became path-defining for her career, steering it toward computer usability. After pioneering HCI work at Honeywell, Dray left for American Express, and later became a successful and impactful HCI consultant/entrepreneur. She applied observation, ethnographic interviewing, and the science of design to improve interaction, processes, and human-machine symbiosis in cultures globally, from the U.S., South Africa, Egypt, and Jordan to India, Panama, and France.

Earlier, in the late 1980s, at American Express, Dray was seeking funds for a usability lab, and she creatively engaged in surreptitious user research. She bought a “carton” of Don Norman’s POET, had copies delivered to all AEFA senior executives on the top (29th) floor, and rode up and down the elevator starting at 6 am for a couple of hours each morning for weeks, listening to conversations concerning this mysteriously distributed book on the science of design. Well informed, she pitched successfully, gaining approval for her usability lab.

This essay is informed by the Norman, Card, and Dray oral histories; another HCI interview I recently conducted, with artist Dr. Ernest Edmonds; my prior interview with Turing Awardee Butler Lampson of Alto fame; preparation for these five interviews; and AI and HCI research in the CBI, MIT, and Stanford University archives.

For AI and HCI, Is There a Season?

Microsoft Research Senior Scientist Jonathan Grudin—in his valuable From Tool to Partner (2017) on HCI’s history—includes a provocative argument that HCI thrives during AI Winters and suffers during AI’s other seasons. The usefulness of the widespread Winter metaphor is debatable, since it is based on changing funding levels at elite schools (Mendon-Plasek, 2021, p. 55), but Grudin’s larger point—that only one of the two fields thrives at a time—hints at a larger truth: HCI and AI have major differences. The fields overlap, with some scientists and some common work, but have distinct mindsets. Ironically, AI, once believed to be long on promises and short on deliveries (the rationalized basis for AI Winters), is now delivering more strongly, and likely more harmfully, than ever, given algorithmic and data biases in far-reaching corporate and government systems.

Learning How Machines Learn Bias

More and more of our devices are “smart,” a distracting euphemism obscuring how AI (in increasingly interconnected sensor/IoT/cloud/analytics systems) reinforces and extends biases based on race, ethnicity, gender, sexuality, and disability. Recent interdisciplinary scholarship is exposing the roots of discriminatory code (algorithms/software) and codes (laws, policy, culture), including deeply insightful keynotes at the Charles Babbage Institute’s (CBI) “Just Code” Symposium (a virtual, major event with 345 attendees in October 2020) by Stephanie Dick, Ya-Wen Lei, Kenneth Lipartito, Josh Lauer, and Theodora Dryer. Their work contributes to an important conversation also extended in important scholarship by Ruha Benjamin, Safiya Noble, Matt Jones, Charlton McIlwain, Danielle Allen, Jennifer Light (MIT; and CBI Sr. Research Fellow), Mar Hicks, Virginia Eubanks, Lauren Klein, Catherine D’Ignazio, Amanda Menking, Aaron Mendon-Plasek (Columbia; and current CBI Tomash Fellow), and others.

AI did not merely evolve from a benevolent past to a malevolent present. Rather, it has been used for a range of different purposes at different times. Geometrically expanding the number of transistors on chips—the (partially) manufactured and manufacturing/fabrication trajectory of Moore’s Law—enabled computers and AI to become increasingly powerful and pervasive. Jennifer Light’s insightful scholarship on the RAND Corporation’s 1950s and 1960s operations research, systems engineering, and AI, created in the defense community and later misapplied to social welfare, counters notions of an early benevolent age. Even if chess is the drosophila of AI—a phrase of John McCarthy’s from the 1990s—its six-decade history is one of consequential games, power contests. Work in computer rooms in the Pentagon’s basement and at RAND harmfully escalated Cold War policies as DoD/contractors simulated and supported notions of the U.S. rapidly “winning” the Vietnam War, and earlier, C-E-I-R (founded by ex-RAND scientists) used input/output-economics algorithmic systems to determine optimal bomb targets to decimate the Soviet Union industrially (Yost, 2017).

What helped pull AI out of its first long (1970s) Winter were successes and momentum with expert systems—the pioneering work of Turing Awardee Stanford AI scientist Edward Feigenbaum and molecular biologist and Nobel Laureate Joshua Lederberg on Dendral in the late 1960s, to advance organic chemistry, and Feigenbaum and others’ early 1970s MYCIN in medical diagnostics and therapeutics. These AI scientific triumphs stood out and lent momentum to expert systems, as did fears of Japan’s Fifth Generation (an early 1980s government and industry partnership in AI/systems). In the 1980s, elite US CS departments again received strong federal support for AI. Work in expert systems in science, medicine, warfare, and computer intrusion detection abounded (Yost, 2016).

Some AI systems are born biased; others learn it—from algorithmic tweaks to expert system inference engines to biased data. Algorithmic bias is just one of the many problematic byproducts of valuing innovation over maintenance (Vinsel and Russell 2020, Yost 2017).

Human Factors and Ergonomics

The pre-history/early history of human-machine interaction dates back many decades to the control of workers and soldiers to maximize efficiency. The Human Factors Engineering Society, spawned in the late 1950s, grew out of late interwar-period organizational work in the Southern California aerospace industry. In the first half of the 20th century, human factors had meaningful roots in the scientific management thought, writings, and consulting of Frederick Winslow Taylor. This tradition defined the worker as an interchangeable part, a cog within the forces of production to efficiently serve capital. At Taylorist-inspired and -organized factories, management oppressed laborers, and human factors has a mixed record in its targets, ethics, and outcomes. However, at HCI’s organizational start in the early 1980s, the mantra was not merely efficiency; it was the frequently uttered “know the user.” This, importantly, was a setting of personal computing and GUI idealism, a trajectory insightfully explored by Stanford’s Fred Turner in From Counterculture to Cyberculture.

Xerox Palo Alto Research Center (PARC)
Image of the Xerox Palo Alto Research Center (PARC) in 1977. In the 1970s and 1980s, PARC had an incredible team of some of the world’s top computer scientists. [By Dicklyon, own work, CC BY-SA 4.0, https://commons.wikimedia.org/w/index.php?curid=95847361]

We’re on a Road to Intertwingularity, Come on Inside

Years before the National Science Foundation (NSF) took the baton to become the leading federal funder of basic CS research at universities, ARPA’s IPTO (following the vision of its 1962 founding director, J.C.R. Licklider) changed the face of computing toward interaction. Well-known philosopher and sociologist Ted Nelson, a significant HCI contributor of the 1960s and 1970s, creatively coined the term “intertwingularity” to capture symbiosis and everything being intertwined or connected (networking, text through his term/concept of “hypertext,” the human user with interactive computing)—it aptly describes the multifaceted HCI work of the 1960s IPTO-funded ARC at SRI and 1970s Xerox PARC.

The 1970-enacted Mansfield Amendment required a direct and defined DoD function for all DoD research funding. It left a federal funding vacuum for years until NSF could ramp up to become a roughly comparable basic funder for the interactive computing that IPTO started. The vacuum, however, was largely filled by a short golden age of corporate industrial research in interactive computing at Xerox, a firm with a capital war chest (much dry powder) from its past photocopier patent-based monopoly and seeking to develop the new, new thing(s). Xerox looked to its 1970-launched PARC to invent the office of the future. It hired many previously IPTO-supported academic computer scientists and produced and employed a cadre of Turing Awardees, an unprecedented team far exceeding any single university’s CS department in talent or resources.

Inside the PARC Homeruns

Douglas Engelbart and the earliest work on the first mouse, designed by him and SRI’s Bill English, are addressed by French sociologist Thierry Bardini in Bootstrapping, a biography of Engelbart. Journalists, such as Michael Hiltzik, have covered some major contours of technical innovation at PARC.

Central to Bardini’s, Hiltzik’s, and others’ narratives is the important HCI work of Turing Awardees Douglas Engelbart at SRI and Butler Lampson, Alan Kay, and Charles Thacker at PARC, along with PARC’s Charles Simonyi. In this essay I look beyond oft-told stories and famed historical actors in GUIs and mice to briefly discuss a hitherto largely overlooked, highly impressive small PARC research team composed of Newell, Card, and Moran, and a larger team that Card later led. The incredible accomplishments of Lampson and others changed the world with the GUI. They hit the ball out of the park, so to speak—"a shot heard round the world” (in the sense of Bobby Thomson’s 1951 Polo Grounds home run, immortalized by Don DeLillo) that very visibly revolutionized interactive computing.

Newell is one of the most famous of the first-generation AI scientists, a principal figure at John McCarthy’s famed Dartmouth Summer 1956 Workshop, at which McCarthy, Newell, Herbert Simon, Marvin Minsky, and others founded and gave name to the field—building upon earlier work of Alan Turing. On a project launched in 1955, Newell, as lead, co-invented (with Simon and Clifford Shaw) “The Logic Theorist” in 1956, the first engineered, automated logic or AI program. Many historians and STS colleagues I have spoken with associate Newell solely with AI and are unaware of his PARC HCI work. Unlike Turing and Simon, Newell does not have a major biography documenting the full breadth of his work. Newell’s HCI research has been neglected by historians, as has that of his two top students, Card and Moran. They published many seminal HCI papers in Communications of the ACM and other top journals.

This oversight (by historians; the three were revered by fellow scientists), especially the neglect of the career-long contributions of Card and Moran, is a myopic favoring of first-recognized invention over subsequent ones, missing key innovations and devaluing maintenance. It was not merely the dormouse (mouse co-inventors Engelbart and English, the recognized revolution), but multiple dormice (the science and engineering behind optimizing mice for users). Remember(ing) what the dormice said (and with an open ear of historical research), Card and Moran clearly conducted brilliant scientific research spawning many quiet revolutions.

Stuart Card
Xerox PARC’s and Stanford University’s Stuart Card

Rookie Card to All-Star Card, Pioneering HCI Scientist Stuart Card

Stuart Card was first author of a classic textbook, The Psychology of Human-Computer Interaction, with co-authors Moran and Newell. Card progressed through various research staff grades and in 1986 became a PARC Senior Research Scientist. Two years later, he became Team Leader of PARC’s User Interface Research Group. The breadth and contributions of Card’s and PARC’s HCI research from the 1970s to the 1990s are wide in both theory and practice. The work fell into three broad categories: HCI models, information visualization, and information retrieval—and the major contributions in each are breathtaking. One early contribution in HCI models was Card and the team’s work on the mouse, modeling its performance with an information-theoretical model of motor movement, Fitts’ Law, using a processing rate parameter of roughly 10 bits/sec, about the same performance ability as the hand. This demonstrated that pointing speed was limited not by the device/mouse but by the hand itself, showing that the mouse was a nearly optimal pointing device for human use. This work shaped the development of the Xerox Star mouse in 1981 and the earliest computer mice developed by Apple Computer. Card’s, and his team’s, work was equally profound in information visualization, in areas such as the Attentive-Reactive Visualizer and Visualizer Transfer Functions. In information retrieval, they advanced Information Foraging Theory.
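To make the information-theoretic framing concrete, the following is a minimal sketch of the Shannon formulation of Fitts’ Law, MT = a + b log2(D/W + 1). The function name and the constants (a zero intercept and a slope of 0.1 seconds per bit, i.e., roughly the 10 bits/sec rate cited above) are illustrative assumptions, not the exact values reported by Card and his colleagues.

```python
import math

def fitts_movement_time(distance, width, a=0.0, b=0.1):
    """Predicted pointing time (seconds) under Fitts' Law.

    Shannon formulation: MT = a + b * log2(D/W + 1), where the
    index of difficulty log2(D/W + 1) is measured in bits and 1/b
    is the effective information-processing rate in bits/sec.
    The intercept a and slope b used here are illustrative placeholders.
    """
    index_of_difficulty = math.log2(distance / width + 1)  # bits
    return a + b * index_of_difficulty

# Example: a 160 mm move to a 10 mm target is ~4.1 bits of difficulty,
# so at ~10 bits/sec it should take roughly 0.4 s plus any fixed overhead.
print(round(fitts_movement_time(160, 10), 2))  # -> 0.41
```

The relevant point for Card’s argument is the slope term: if the measured slope for the mouse comes out close to the hand’s own information-processing rate, then the device is not the bottleneck, which is the sense in which the mouse was shown to be well matched to its human user.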

While staying at PARC for decades, Card concurrently served as a Stanford University Professor of Computer Science. He became a central contributor to SIGCHI and was tremendously influential to academic, industrial, and government scientists.

In listening to Card’s interview responses (and deeply influenced by my Norman, Dray, and Butler Lampson interviews, as well as by my past research), I reflected that many AI scientists could learn much from such a mindset of valuing users, all users—knowing users to help augment, for symbiosis, not to control. AI scientists, especially those working on large-scale systems in corporations and government (much ethical AI research is done at universities), could benefit not merely in technical ways, as Steve Jobs and others did from their day in the PARC, but from Card and his team’s ethos and ethics.

Professionalizing HCI: Latent Locomotion to Blissful Brownian Motion

While SIGCHI unintentionally pushed out many non-scientists in the 1980s, it and the HCI field shed a strictly computer science and cognitive science focus to become ever more inclusive of a wide variety of academic scientists, engineers, social scientists, humanities scholars, artists, and others from the 1990s forward. CHI grew from about 1,000 attendees at the first events in Gaithersburg and Boston to more than 3,600 at some recent annual CHI meetings (and SIGCHI now has more than two dozen smaller conferences annually). The SIGCHI/CHI programs and researchers are constantly evolving and exploring varying creative paths that from a 30,000-foot vantage might seem to be so many random walks, Brownian motion. The research, designing to better serve users, contributes to many important trajectories. The diversity of disciplines and approaches can make communication more challenging, but also more rewarding, and to a high degree a Galison-like trading zone exists in interdisciplinary SIGCHI and HCI.

One example is the Creativity and Cognition Conference, co-founded by artists/HCI scientists Ernest Edmonds and Linda Candy in 1993, which became a SIGCHI event in 1997. It brings together artists, scientists, engineers, and social scientists to share research and work on human-computer interaction in art and systems design. As Edmonds related to me, communication and trust between artists and scientists take time to build, but are immensely valuable. Edmonds is an unparalleled figure in computer generative and interactive art, and a core member of the Systems Group of principally UK computer generative artists. In addition to many prestigious art exhibitions in the 1970s (and beyond), Edmonds published on adaptive software development, with a critique of the waterfall method. His work—in General Systems in 1974—anticipated and helped to define adaptive techniques, later referred to as agile development. Edmonds, through his artist, logician, and computer science lenses, insightfully saw interactive and iterative processes, a new paradigm in programming technique, art, and other design.

HCI research, and its applications, certainly is not always in line with societal good, but it has an idealistic foundation and values diversity and interdisciplinarity. Historians still are in the early innings of HCI research. Elizabeth Petrick has done particularly insightful scholarship on HCI and disability (2015).

Coding and Codifying, Fast and Slow

Nobel Laureate Daniel Kahneman has published ideas on human cognition that are potentially useful to ponder with regard to AI and HCI. Kahneman studies decision making and judgment, and how different aspects of these arise from how we think—both fast (emotionally, unconsciously, and instinctively) and slow (more deeply and analytically).

Programming projects for applications and implementation of systems are often behind schedule and over-budget. Code, whether newly developed or recycled, often is applied without an ethical evaluation of its inherent biases.

HCI often involves multiple iterations with users, usability labs, observation in various settings, ethnographic interviewing, and an effective blend of both inspiring emotional-response, fast thinking and, especially, deep reflective slow thinking. This slow and analytical thinking and iterative programming (especially maintenance and endless debugging) could be helpful in beginning to uproot underlying algorithmic biases. Meanwhile, slow and careful reflection on how IT laws, practices, policies, culture, and data are codified is instructive. All of this involves ethically interrogating the what, how, why, and by and for whom of innovation, and valuing maintenance labor and processes, not shortchanging maintenance in budget, respect, or compensation.

Beyond “Laws” to Local Knowledge

In 1967 computer scientist Melvin Conway noted what became christened Conway’s Law—computer architecture reflects the communication structure of the organization in which it was developed (an observation made famous by Tracy Kidder’s The Soul of a New Machine). Like Moore’s Law, Conway’s Law is really an observation, and a self-fulfilling prophecy. Better understanding and combatting biases at the macro level is critical. Also essential is evaluation and action at the local and organizational levels. How does organizational culture structure algorithms/code? What organizational policies give rise to what types of code? What do (end) users, including and especially marginalized individuals and groups, have to say about bias? How do decisions at the organizational level reinforce AI/ML algorithmic and data biases, and reinforce and accelerate societal inequality? These are vital questions to consider through many future detailed case studies in settings globally. The goal should not be a new “law,” but rather a journey to gain local knowledge and learn how historical, anthropological, and sociological cases inform code and codes toward policies, designs, maintenance, and structures that are more equitable.

“Why Not Phone Up Robinhood and Ask Him for Some Wealth Distribution”

The lyric above, from the 1978 reggae-influenced song “(White Man) In Hammersmith Palais” by The Clash, might be updated to why not open a Robinhood app… (at least until it suspended trading). How historians will later assess the so-called Robinhood/Reddit “Revolution,” a transfer of $20 billion away from hedge funds/banks/asset managers over several weeks in early 2021 (punishing bearish GameStop shorting by bidding up shares to force short covering), remains to be seen. Is it a social movement, and of what demographic makeup and type? For many, it likely, at least in part, is a stand against Wall Street, and thus Zuccotti Park comparisons seem apropos. Eighty percent of stock trading volume is automated—algorithmic/programmed (AI/ML)—contributing to why a 2021 CNBC poll showed 64 percent of Americans believe Wall Street is rigged. Like capitalism, equities markets and computers combine as a potent wealth-concentrating machine—one turbocharged in pandemic times and fueled by accommodative monetary policy. “Smart” systems/platforms in finance, education, health, and policing have all accelerated longstanding wealth, health, and incarceration gaps and divergences to hitherto unseen levels. Not to dismiss volatility or financial risk to the Reddit “revolutionaries,” but the swiftness of regulatory calls by powerful leaders is telling. It begs questions about priorities: regulation for whom, of what, when, and why? U.S. IT giants’ use of AI to surveil, and to dominate with anti-competitive practices, has gone largely unregulated (as has fintech) for years. Given differential surveillance, Blacks, Indigenous people, and People of Color (BIPOC) suffer differentially. The U.S. woefully lags Europe on privacy protections and personal data corporate taxes. U.S. racial violence/murders by police disgracefully dwarf those of other democratic nations, and America stands out for its (police and courts) embrace of racially biased facial recognition technology (FRT) and recidivism-predicting AI—such as Clearview FRT and Northpointe’s (now Equivant) Correctional Offender Management Profiling for Alternative Sanctions (COMPAS).

Meanwhile, the parallel Chinese IT giants Baidu, Alibaba, and Tencent, dominant in search, e-commerce, and social networking respectively, use intrusive AI. These firms (fostered by the government), ironically, are also contributing to platforms enabling a “contentious public sphere” (Lei 2017).

At times, users can appropriate digital computing tools against the powerful in unforeseen ways. Such historical agency is critical to document and analyze. History informs us that AI/ML, like many technologies, left unchecked by laws, regulations, and ethical scrutiny will continue to be powerfully accelerating tools of oppression.

Raging Against Machines That Learn

The record of U.S.-headquartered, AI-based IT corporate giants on data and analytics policy and practices has garnered increasing levels of critique by journalists, academics, legislators, activists, and others. The New York Times has reported on clampdowns on employees expressing themselves on social and ethical issues. Timnit Gebru, co-leader of Google’s Ethical AI group, tweeted in late 2020 that she was fired for sending an email encouraging minority hiring and drawing attention to bias in artificial intelligence. Her email included, “Your life starts getting worse when you start advocating for underrepresented people. You start making the other leaders upset.” (Metz and Wakabayashi 2020).

On June 30, 2020, U.S. Senators Robert Menendez, Mazie Hirono, and Mark Warner wrote Facebook CEO Mark Zuckerberg critiquing his company for failing to “rid itself of white supremacist and other extremist content.” (Durkee 2020). A subsequent Facebook internal audit called for better AI—a tech fix. Deep into 2019, Zuckerberg (with a lack of clarity, as at Georgetown in October 2019) sought to defend Facebook’s policies on the basis of free speech. More concerning than his inability to execute free speech arguments is the lack of transparency and the power wielded by a platform with 2.5 billion users; it has immense power to subvert democracy and to harm differentially. It has a clear record of profits over principles. In mid-2020, Color of Change, the NAACP, the National Hispanic Media Coalition, and others launched the “Stop Hate for Profit” boycott of Facebook advertising for July 2020; more than 1,200 organizations participated. Pivoting PR amid changing political winds, Zuckerberg is seeking to shift responsibility to Congress, asking it to regulate (Facebook’s legal team likely will defend the bottom line).

Data for Black Lives, led by Executive Director Yeshimabeit Milner, is an organization and movement of activists and mathematicians. It focuses on fighting for possibilities for data use to address societal problems and fighting against injustices, stressing that “discrimination is a high-tech enterprise.” It recently launched Abolish Big Data, “a call to action to reject the concentration of Big Data in the hands of the few, to challenge the structures that allow data to be wielded as a weapon…” (www.d4bl.org). This organization is an exemplar of the vital work for change underway, and also of the immense challenge ahead given the power of corporations and government entities (NSA, CIA, FBI, DoD, police, courts).

HCI, never the concentrating force AI has become, continues to steadily grow as a field—intellectually, in diversity, and in importance. It has a record of embracing diversity, helping to augment and advance human and computer symbiosis. More historical work on HCI is needed, but it offers a useful mindset.

Given the AI historical scholarship to date, we know its record has been mixed from the start. From its first decades in the 1950s and 1960s to today, the DoD, NSA/CIA/FBI, police, and criminal justice systems have been frequent funders, deployers, and users of AI systems plagued with algorithmic biases that discriminate against BIPOC, women, LGBTQIA people, and the disabled. Some of the most harmful systems have been in facial recognition and predictive policing. Yet, properly designed, monitored, and maintained, AI offers opportunities for science, medicine, and social services (especially at universities and nonprofits).

The social sciences, humanities, and arts can have a fundamental, positive role in the design, structuring, and policies of AI/ML. A handful of universities have recently launched interdisciplinary centers focused on AI, history, and society, including the AI Now Institute at NYU (2017) and the Institute for Human-Centered AI at Stanford (2019). The Charles Babbage Institute has made the interdisciplinary social study of AI and HCI a focus (with “Just Code” and beyond)—research, archives, events, oral histories, and publications. In CS, ACM’s Conference on Fairness, Accountability, and Transparency (FAccT), launched in 2018, offers a great forum. Outside academe, many are doing crucial research, policy, and activist work—a few examples: Data for Black Lives; Blacks in Technology; NCWIT; AnitaB.org; the Algorithmic Justice League; Indigenous AI.Net; and the Algorithmic Bias Initiative (U. of Chicago).

The lack of U.S. regulation to date, discrimination and bias, corporate focus on and faith in tech fixes, inadequate transparency, corporate imperialism, and the overpowering of employees and competitors have many historical antecedents inside and outside computing. History—the social and policy history of AI and HCI, as well as other labor, race, class, gender, and disability history—has much to offer. It can be a critical part of a broad toolkit to understand, contextualize, and combat power imbalances—to better ensure just code and ethically shape and structure the ghost in the machine that learns.

Acknowledgments: Deep thanks to Bill Aspray, Gerardo Con Diaz, Andy Russell, Loren Terveen, Honghong Tinn, and Amanda Wick for commenting on a prior draft.


Bibliography

Allen, Danielle and Jennifer S. Light. (2015). From Voice to Influence: Understanding Citizenship in a Digital Age. University of Chicago Press.

Alexander, Jennifer. (2008). The Mantra of Efficiency: From Waterwheel to Social Control. Johns Hopkins University Press.

Bardini, Thierry. (2000). Bootstrapping: Coevolution and the Origins of Personal Computing. Stanford University Press.

Benjamin, Ruha. (2019). Race After Technology: Abolitionist Tools for the New Jim Code. Polity.

Card, Stuart K., Thomas Moran, and Allen Newell (1983). The Psychology of Human-Computer Interaction. Lawrence Erlbaum Associates.

Card, Stuart K., Oral History (2020). Conducted by Jeffrey R. Yost, Los Altos Hills, CA, February 17, 2020. CBI, UMN.

Dick, Stephanie. (2020). “NYSIIS, and the Introduction of Modern Digital Computing to American Policing.” Just Code: Power, Inequality, and the Global Political Economy of IT (Symposium presentation: Oct. 23). [Hereafter “Just Code” Symposium]

D’Ignazio, Catherine and Lauren Klein. (2020). Data Feminism. MIT Press.

Dray, Susan, Oral History (2020). Conducted by Jeffrey R. Yost, CBI, Minneapolis, Minnesota, January 28, 2020. CBI, UMN.

Durkee, Alison. (2020). “Democratic Senators Demand Facebook Answer For Its White Supremacist Problem.” Forbes. June 30. (accessed online at Forbes.com).

Dryer, Theodora. (2020). “Streams of Data, Streams of Water: Encoding Water Policy and Environmental Racism.” “Just Code” Symposium.

Edmonds, Ernest. (1974). “A Process for the Development of Software for Non-Technical Users as an Adaptive System.” General Systems 19, 215-218.

Eubanks, Virginia. (2019). Automating Inequality: How High-Tech Tools Profile, Police, and Punish the Poor. Picador.

Galison, Peter. (1999) “Trading Zone: Coordinating Action and Belief.” In The Science Studies Reader, ed. by Mario Biagioli. Routledge. 137-160.

Grudin, Jonathan. (2017). From Tool to Partner: The Evolution in Human-Computer Interaction. Morgan and Claypool.

Hiltzik, Michael. (2009). Dealers in Lightning: Xerox PARC and the Dawn of the Computer Age. HarperCollins.

Kahneman, Daniel. (2011). Thinking, Fast and Slow. Farrar, Straus, and Giroux.

Kidder, Tracy. (1981). The Soul of a New Machine. Little, Brown, and Company.

Lampson, Butler, Oral History (2014). Conducted by Jeffrey R Yost, Cambridge, Massachusetts, December 11, 2014. Charles Babbage Institute, UMN.

Lauer, Josh and Kenneth Lipartito. (2020). “Infrastructures of Extraction: Surveillance Technologies in the Modern Economy.” “Just Code” Symposium.

Light, Jennifer S. (2005). From Warfare to Welfare: Defense Intellectuals and Urban Problems in Cold War America. Johns Hopkins University Press.

McIlwain, Charlton. (2020). Black Software: The Internet and Racial Justice, from the AfroNet to Black Lives Matter. Oxford University Press.

Mendon-Plasek, Aaron. (2021). “Mechanized Significance and Machine Learning: Why It Became Thinkable and Preferable to Teach Machines to Judge the World.” In J. Roberge and M. Castelle, eds. The Cultural Life of Machine Learning. Palgrave Macmillan, 31-78.

Menking, Amanda and Jon Rosenberg. (2020). “WP:NOT, WP:NPOV, and Other Stories Wikipedia Tells Us: A Feminist Critique of Wikipedia's Epistemology.” Science, Technology, & Human Values, May, 1-25.

Metz, Cade and Daisuke Wakabayashi. (2020). “Google Researcher Says She Was Fired Over Paper Highlighting Bias in AI.” New York Times, Dec. 2, 2020.

Norman, Don, Oral History. (2020). Conducted by Jeffrey R. Yost, La Jolla, California, February 12, 2020. CBI, UMN.

Noble, Safiya Umoja. (2018). Algorithms of Oppression: How Search Engines Reinforce Racism. New York University Press.

Petrick, Elizabeth. (2015). Making Computers Accessible: Disability Rights and Digital Technology. Johns Hopkins University Press.

Turner, Fred. (2010). From Counterculture to Cyberculture: Stewart Brand, the Whole Earth Network, and the Rise of Digital Utopianism. University of Chicago Press.

Vinsel, Lee and Andrew L. Russell. (2020). The Innovation Delusion: How our Obsession with the New has Disrupted the Work That Matters Most. Currency.

Yost, Jeffrey R. (2016). “The March of IDES: Early History of Intrusion Detection Expert Systems.” IEEE Annals of the History of Computing 38:4, 42-54.

Yost, Jeffrey R. (2017). Making IT Work: A History of the Computer Services Industry. MIT Press.

 

Yost, Jeffrey R. (March 2021). “Of Mice and Mentalité: PARC Ways to Exploring HCI, AI, Augmentation and Symbiosis, and Categorization and Control.” Interfaces: Essays and Reviews in Computing and Culture Vol. 2, Charles Babbage Institute, University of Minnesota, 12-26.


About the author:  Jeffrey R. Yost is CBI Director and HSTM Research Professor at the University of Minnesota. He has published six books (and dozens of articles), most recently Making IT Work: A History of the Computer Services Industry (MIT Press, 2017) and FastLane: Managing Science in the Internet World (Johns Hopkins U. Press, 2016) [co-authored with Thomas J. Misa]. He is a past EIC of IEEE Annals of the History of Computing, and current Series Co-Editor [with Gerard Alberts] of Springer’s History of Computing Book Series.  He has been a principal investigator of a half dozen federally sponsored projects (NSF and DOE) on computing/software history totaling more than $2 million. He is Co-Editor [with Amanda Wick] of Interfaces: Essays and Reviews in Computing & Culture.


 

2021 (Vol. 2) Articles

The Cloud, the Civil War, and the “War on Coal”

Paul E. Ceruzzi, National Air and Space Museum, Smithsonian Institution

Abstract: The term “The Cloud” has entered the lexicon of computer-speak along with “cyberspace,” “the Matrix,” the “ether,” and other terms suggesting the immateriality of networked computing. Cloud servers, which store vast amounts of data and software accessible via the Internet, are located around the globe. This essay argues that this “matrix” has an epicenter, namely the former rural village of Ashburn, Virginia. Ashburn’s significance is the result of several factors, including northern Virginia’s historic role in the creation of the Internet and its predecessor, the ARPANET. The Cloud servers located there also exist because of the availability of sources of electric power, including a grid of power lines connected to wind turbines, gas-, and coal-fired plants located to its west—a “networking” of a different type but just as important.

(PDF version available for download.)

 

civil war map
A map of Northeastern Virginia and vicinity of Washington prepared by the Union Army in 1862. Labels inserted by the author indicate “Cloud” sites.

In his recent book, Making IT Work, Jeffrey Yost quotes a line from Joni Mitchell’s famous song “Both Sides, Now” (from her album Clouds): “I really don’t know clouds at all.” He also quotes the Rolling Stones’ hit “Get Off of My Cloud.” Why should a business or government agency trust its valuable data to a third party whose cloud servers are little understood? No thank you, said the Rolling Stones; not until you can explain to me just what the Cloud is and where it is. Yost gives an excellent account of how cloud servers have come to the fore in current computing. Yet Joni Mitchell’s words still ring true. Do we really know what constitutes the “Cloud”?

A common definition of the Cloud is sets of high-capacity servers, scattered across the globe, using high-speed fiber to connect the data stored therein to computing installations. These servers supply data and programs to a range of users, from mission-critical business customers to teenagers sharing photos on their smartphones. What about that definition is cloud-like? Our imperfect understanding of the term is related to misunderstandings of similar terms also in common use. One is “cyberspace,” the popularity of which is attributed to the science fiction author William Gibson and his novel Neuromancer, published in 1984. Another is “the Matrix,” the title of a path-breaking book on networking by John Quarterman, published in 1990 at the dawn of the networked age. The term came into common use after the award-winning 1999 Warner Brothers film starring Keanu Reeves. (Quarterman was flattered that Hollywood used the term, but he is not sure whether the producers of the film knew of his book.) In the early 1970s, Robert Metcalfe, David Boggs, and colleagues at the Xerox Palo Alto Research Center developed a local area networking system they called “Ethernet,” suggesting the “luminiferous aether” that was once believed to carry light through the cosmos.

These terms suggest an entity divorced from physical objects—pure software independent of underlying hardware. They imply that one may dismiss the hardware component as a given, just as we assume that fresh, drinkable water comes out of the tap when we are thirsty. The residents of Flint, Michigan, know that a robust water and sewerage infrastructure is hardly a given, and Nathan Ensmenger has reminded us that the “Cloud” requires a large investment in hardware, including banks of disk drives, air conditioning, fiber connections to the Internet, and above all, a supply of electricity. Yet the perception persists that the Cloud, like cyberspace, is out there in the “ether.”

Cloud server, Ashburn.
Cloud server, Ashburn. Note air conditioning units on the roof. Photo by the Author.

Most readers of this journal know the physical infrastructure that sustains Ethernet, Cyberspace, and the Cloud. I will go a step further: not only does the Cloud have a physical presence, but it also has a specific location on the globe: Ashburn, Virginia.

A map prepared by the Union Army in 1862 of Northern Virginia shows the village of Farmwell, and nearby Farmwell Station on the Alexandria, Loudoun, and Hampshire Railroad. The town later changed its name to Ashburn, and it lies just to the north of Washington Dulles International Airport. In the early 2000s, as I was preparing my study of high technology in northern Virginia, Ashburn was still a farming community. By the year 2000, Farmwell Station was a modest center of Ashburn: a collection of buildings centered on a general store. The railroad had been abandoned in 1968 and was now the Washington and Old Dominion rail-trail, one of the most popular and heavily traveled rails-to-trails conversions in the country. Thirsty hikers and cyclists could get refreshment at the general store, which had also served neighboring farmers with equipment and supplies.

“a:” root server bike trail
Former location of the “a:” root server of the dot.com and dot.org registries (far left), Herndon. Photo taken by the author from the W&OD rail trail.

Cycling along the trail west of Route 28 in 2020, one saw a series of enormous low buildings, each larger than a football field and surrounded by a frenzy of construction, with heavy equipment trucks chewing up the local roads. Overhead was a tangle of high-tension electrical transmission towers, with large substations along the way distributing the power. The frenzy of construction suggested what it must have been like in Virginia City, Nevada, after the discovery and extraction of the Comstock Lode silver. The buildings themselves had few or no markings on them, but a Google search revealed that one of the main tenants was Equinix, a company that specializes in networking. The tenants of the server buildings try to avoid publicity, but the local chambers of commerce, politicians, and real estate developers are proud to showcase the economic dynamo of the region. A piece on the local radio station WTOP on November 17, 2020, announced that “Equinix further expands its big Ashburn data center campus,” quoting a company spokesperson saying that “…its Ashburn campus is the densest interconnection hub in the United States.” An earlier WTOP broadcast, reporting on the activities of a local real estate developer, noted that “Northern Virginia remains the ‘King of the Cloud.’” In addition to Equinix, the report mentioned several other tenants, including Verizon and Amazon Web Services.

These news accounts are more than just hyperbole from local boosters. Other evidence indicates that, although cloud servers are scattered across the globe, Ashburn is indeed the navel of the Internet.

In my 2008 study of Tysons Corner, Virginia, I mentioned several factors that led to the rise of what I then called “Internet Alley.” One was the development of ARPANET at the Pentagon, and later at a DARPA office on Wilson Blvd. in Rosslyn. Another was the rise of the proto-Internet company AOL, headquartered in Tysons Corner. Tysons Corner was also the location of “MAE-East,” a network hub that carried a majority of Internet traffic in the network’s early days. The root servers of the dot.com and dot.org registries were once located in the region, with the “a:” root server in Herndon, later moved to Loudoun County. The region thus had a skilled workforce of network-savvy electrical and computer engineers, plus local firms such as SAIC and Booz Allen that supported networking as it evolved from its early incarnations.

ARPA signs
Plaques at 1401 Wilson Blvd., Rosslyn, commemorating the work of ARPA’s Information Processing Techniques Office, whose offices were in this building. The binary code spells “ARPANET” in 8-bit ASCII. Photos by the author.
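For readers curious how the plaque’s pattern of ones and zeros maps back to letters, here is a minimal sketch in Python; it simply prints the 8-bit ASCII codes for “ARPANET” and makes no claim about how the plaque arranges those bits visually.

# Print "ARPANET" as 8-bit ASCII, the encoding the plaques use.
def to_8bit_ascii(text):
    return " ".join(format(ord(ch), "08b") for ch in text)

print(to_8bit_ascii("ARPANET"))
# 01000001 01010010 01010000 01000001 01001110 01000101 01010100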
 

Around the year 2000, while many were relieved that the “Y2K” bug had little effect on mainframe computers, the dot.com frenzy collapsed. The AOL-Time Warner merger was a mistake. But there was an upside to the boom and bust. In the late 19th and early 20th century the nation experienced a similar boom and bust of railroad construction. Railroads went bankrupt and people lost fortunes, but the activity left behind a robust, if overbuilt, network of railroads that served the nation well during the mid and late 20th century. During the dot.com frenzy, small firms like Metropolitan Fiber dug up many of the roads and streets of Fairfax and Loudoun Counties and laid fiber optic cables, which offered high-speed Internet connections. After the bust these became unused, so-called “dark fiber.” Here was the basis for establishing Cloud servers in Ashburn. By 2010, little land was available in Tysons Corner, Herndon, or Reston, but a little farther out along the W&OD rail-trail there was plenty of available land.

That leaves the other critical factor in establishing Cloud servers—the availability of electric power. While some Cloud servers are located near sources of wind, solar, or hydroelectric power, such as in the Pacific Northwest, Northern Virginia has few of those resources. The nearest large-scale hydroelectric plant, at the Conowingo Dam, lies about 70 miles to the north, but its power primarily serves the Philadelphia region. (That plant was the focus of the classic work on electric power grids, Networks of Power, by Thomas Parke Hughes.) To answer the question of the sources of power for Ashburn, we return to the Civil War map and its depiction of the Alexandria, Loudoun, and Hampshire, later known as the Washington and Old Dominion Railroad.

The origins of that line go back to the 1840s, when freight, especially coal, from the western counties of Virginia was being diverted to Baltimore, Maryland, over the Baltimore and Ohio Railroad. In response, Virginians chartered a route west over the Blue Ridge to the mineral- and timber-rich areas of Hampshire County. (In 1866, Mineral County was carved out of Hampshire County in the new state of West Virginia.) The Civil War interrupted progress in construction, and after several challenges to its financial structure, the line was incorporated as the Washington and Old Dominion Railway Company in 1911. It never reached farther than the summit of the Blue Ridge, and the proposed route to the west would have had to cross rugged topography. The line could never have competed with the B&O’s water-level route. The shortened line soldiered on until finally being abandoned in 1968, making way for the rail-trail conversion. One interesting exception was a short spur in Alexandria, which carried coal to a power plant on the shore of the Potomac. That plant was decommissioned in 2014, thus ending the rail era of the Alexandria, Loudoun, and Hampshire.

W&OD freight train
Washington & Old Dominion freight train, shortly before the line’s abandonment in 1968.

 

In 1968, the rails-to-trails movement was in its infancy. Most of the freight once carried by rail was now being carried by trucks, and there was little room for rail-dependent industries to survive in a region of fast-growing residential suburban towns. There was every reason to assume that the right of way would revert to local landowners and be developed for commercial and residential use. That was indeed the fate of the line west from Purcellville to the summit of the Blue Ridge at Snickers Gap. But the rest of the right of way was preserved. Shortly before abandonment, the Virginia Electric Power Company entered into an agreement with the Virginia Highway Department to purchase most of the remaining right of way as a conduit for high-voltage power lines, which would supply electric power to Northern Virginia. The agreement was criticized at the time, but among its results was the preservation of the right of way, making way for the establishment of the W&OD rail-trail by the Northern Virginia Regional Park Authority. As mentioned above, the trail is very popular for hiking, cycling, and horseback riding. Most of its users do not mind the power lines overhead. Given the rapid growth of suburbia in Fairfax and Loudoun counties, the trail could not have had the rural character common to many rail-trails in the country.

The power lines tell us how electric power gets to the Cloud servers. Where the power comes from is more complex. At the time Virginia Electric was negotiating for the right of way, the engineering firm Stone and Webster was building a power plant at Mount Storm, in Grant County, West Virginia. The plant was located in the heart of rich coal deposits. Upon its completion, the plant had a capacity of 1,600 Megawatts. Beginning in the early 2000s, the plant’s output was supplemented by a set of wind turbines located along the Allegheny Front, the divide between waters that flow directly to the Atlantic and those that flow to the Ohio and Mississippi Rivers. These turbines supply an additional 264 Megawatts of power.
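A back-of-the-envelope calculation with the capacity figures above shows how the generating mix feeding those lines skews toward coal; the sketch below is illustrative only, since nameplate capacity is not the same as the electricity actually generated.

# Share of nameplate capacity at Mount Storm, using the figures cited above.
coal_mw = 1600   # coal-fired capacity, in Megawatts
wind_mw = 264    # Allegheny Front wind capacity, in Megawatts
total_mw = coal_mw + wind_mw
print(f"wind: {wind_mw / total_mw:.1%} of capacity")   # ~14.2%
print(f"coal: {coal_mw / total_mw:.1%} of capacity")   # ~85.8%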

The Alexandria, Loudoun, and Hampshire Railroad was never completed far enough west to carry coal from the western mountains. Its charter, however, has been fulfilled, as the right of way now carries energy in the form of electricity generated by coal and wind from those mountains. The railroad’s founders were not thinking of Cloud servers, but today’s Cloud is powered, at least in part, by coal.

End of the line
“End of the Line,” 2014—last remnant of W&OD railroad, used to deliver coal to a power plant in Alexandria, before its decommissioning. Photo by the author.
 

Coal mining in West Virginia and western Maryland is in a precipitous decline. Within a few years it may vanish altogether. Those involved with the construction and management of Data Centers in Loudoun County have stated that those centers will reduce their dependency on coal to zero by the next decade. In addition to converting to natural gas, described below, Virginia is supporting the further development of wind turbines, increasingly located offshore as well as in the mountains. Data centers are also exploring the use of geothermal resources.

These efforts will help reverse disturbing trends of global climate change, but the decline of the coal industry has been devastating to the economies of western Virginia, western Maryland, and West Virginia, which are experiencing layoffs and unemployment among miners and railroad workers. The cause is not the so-called “war on coal” allegedly waged by Washington politicians. The primary cause is the development of hydraulic fracturing, or “fracking,” of rock, which allows rapid unlocking of natural gas deposits in Appalachia. The technique does require labor, but not on the scale of traditional coal mining. And the gas is transported not by rail but by pipelines, buried under the ground and largely invisible. Natural gas burns much cleaner than coal. Fracking has allowed natural gas to supplant coal for most new power plants. It also hastened the conversion of older, coal-fired plants to gas. An 800-Megawatt plant in Dickerson, Maryland, across the Potomac from Loudoun County and another major supplier of energy to the region, converted from coal to gas at the end of 2020. A similar conversion has taken place at the Chalk Point, Maryland, plant. As of this writing, the Mount Storm plant remains coal-fired.

Power lines along the bike trail
Power lines, Ashburn. A second set of lines was added after the Panda Stonewall power plant came on line. Photo by the author.

In 2017, the “Panda Stonewall” power plant came on-line. It is located south of Leesburg, a few miles west of Ashburn. The primary market for its 778-Megawatt output is the cloud complex at Ashburn. In promotional literature, the plant’s owner, Panda Power Funds of Dallas, Texas, touts its clean-burning natural gas fuel. The gas is transmitted by pipeline from the Marcellus Shale deposits centered in western Pennsylvania. To handle this new source of power, new substations and overhead lines were built over and beside the W&OD trail from Leesburg to Ashburn.

Conclusion

The center of the Cloud is in Ashburn, Virginia. It runs on a variety of energy sources, including coal, wind, and natural gas from the Marcellus Shale. Cloud servers are indeed scattered across the globe, but in Ashburn one can observe first-hand the dramatic transformation of computing. The servers require electric power, and all of its sources (wind, solar, hydro, coal, and gas) have environmental impacts. In his study of the Cloud, Jeffrey Yost mentioned the two songs by Joni Mitchell and the Rolling Stones. To those I would add a third: the jazz album by the Czech bassist Miroslav Vitous, “Mountain in the Clouds.” The title suggests the serenity and ethereal nature of the cloud, but the music is quite different: a cacophony of clashing instruments, driven by a frenzied drummer and bass line, suggesting the frenzy of cloud construction in Northern Virginia.


Bibliography 

Bechtel Corporation, “Virginia Power Plant is one of the nation’s cleanest.” https://www.bechtel.com/projects/stonewall-energy-center/ accessed 12/10/2020.

Ceruzzi, Paul E. (2008). Internet Alley: High Technology in Tysons Corner, 1945-2005. Cambridge, MA: MIT Press. 

Ensmenger, Nathan. (October 2018). “The Environmental History of Computing,” Technology & Culture, 59/4 Supplement, pp. S7-S33.

Equinix Corporation, “Equinix further expands its big Ashburn data center campus,” https://www.equinix.com/data-centers/americas-colocation/. Accessed 12/10/2020.

“Crushing it: The world is finally burning less coal. It now faces the challenge of using almost none at all,” The Economist, December 5, 2020, pp. 25-28.

Hughes, Thomas Parke. (1983). Networks of Power: Electrification in Western Society, 1880-1930, Baltimore, Johns Hopkins University Press.

National Public Radio, “Supreme Court Says Pipeline May Cross Underneath Appalachian Trail,” Broadcast June 15, 2020, 6:09 PM ET. https://www.npr.org/2020/06/15/877643195/supreme-court-says-pipeline-may-cross-underneath-appalachian-trail. Accessed 12/10/2020.

Vitous, Miroslav, “Mountain in the Clouds,” Atlantic Records, SD 1622, 1975. Hear a YouTube recording of the initial track: https://www.youtube.com/watch?v=zafIe4Aduus. Accessed 12/21/2020.

WTOP Radio, “Northern Virginia remains the ‘king of the cloud’,” https://wtop.com/business-finance/2020/09/northern-virginia-remains-the-king-of-the-cloud/. Accessed 12/10/2020.

Williams, Ames W. (1984). Washington & Old Dominion Railroad, 1847-1968. Meridian Sun Press, p. 109.

Yost, Jeffrey R. (2017). Making IT Work: A History of the Computer Services Industry. Cambridge, MA: MIT Press.

 

Paul E. Ceruzzi (January 2021). “The Cloud, the Civil War, and the ‘War on Coal’.” Interfaces: Essays and Reviews in Computing and Culture Vol. 2, Charles Babbage Institute, University of Minnesota, 1-11.


About the Author:

Paul Ceruzzi is Emeritus Curator of Aerospace Electronics at the Smithsonian Institution's National Air and Space Museum. He is the author of several books on the history of computing and aerospace, including his most recent, GPS (MIT Press, 2018). His book on high technology in Northern Virginia, Internet Alley, was published in 2008. He lives with his family in the Maryland suburbs of Washington, DC.


 

2020 (Vol. 1) Articles

 

Cultural Networks: Infrastructural Implications of AT&T’s Picturephone

Malinda Dietrich, University of Colorado, Boulder

Abstract: In 2020, video telecommunications seem ubiquitous. Between work and play, many people use a range of software to connect them with other people all around the world. This short essay begins to explore how we arrived at this seemingly universal technology by exhuming a failed technology: AT&T’s Picturephone. Through this historical exploration, we will come to see that infrastructure and culture are closely related, and that future work must be done to explore the social inequities that become apparent.

(PDF version available for download.)

 

Introduction

Regardless of how commonplace video calls are now, they are the result of continuous iterations of a failed technology from almost half a century ago: Bell Laboratories’ “Picturephone.” First showcased during the 1964 World’s Fair, the Picturephone demonstrated how people on a call could see one another, at 30 frames per second and in black and white, on small screens (“Mechanical ‘Brains’, Lasers”). The cylindrical device housed a screen on one end and connected to a handset that allowed the user to control the screen (Gertner). Using this technology, two men, one in New York and the other at Disneyland in Anaheim, California, completed the first transcontinental video call, which lasted about ten minutes (Gerber; “Television Phone Used”). After the demonstration, visitors waited in line for a turn to speak on one of the Picturephones in one of six booths (Schnaars and Wymbs; see Figure 1). After the World’s Fair, market researchers conducted 700 interviews with individuals who had attended the fair and used the Picturephone (Gertner; Schnaars and Wymbs). Most users rated the service well, and criticism focused on design features like the inability to turn off the video or adjust the height of the device (Schnaars and Wymbs). Bell Laboratories hoped that this would ignite widespread interest in the device, anticipating that a Picturephone would live inside the homes of most people by the 1980s (see authors in the Bell Laboratories archive).

 

Figure 1: Picturephone booths at the 1964 World’s Fair.

Scholars who have highlighted the failure of the Picturephone point to its many infrastructural issues (see Gertner; Lipartito; Schnaars and Wymbs). Laying supplemental wires to transmit a video signal was expensive, which increased the price of the service, and the picture quality was poor. Missing from these accounts of the Picturephone is an understanding of infrastructure’s relationship to culture. Kenneth Lipartito begins to create this tie by focusing on the material and discursive formations of the Picturephone. His work focuses on the rhetoric of failure, using the Picturephone as an example (Lipartito). The technology worked from an engineering perspective; however, Bell had not considered that the service did not fill a consumer need, or that it needed a critical mass of users to take off (Schnaars and Wymbs). In other words, just because the infrastructure worked did not guarantee the popularity of the technology; this article therefore uses the Picturephone as a historical artifact that allows us to speak to the cultural relations of infrastructure.

“Picturing” the Future of Telecommunications in the Late 1800s

The idea of a Picturephone can be traced to around the same time that the telephone was taking off. On December 9th, 1879, George du Maurier published an illustration of “Edison’s Telephonoscope” in Punch magazine, in which two individuals are depicted using a technology that resembles a phone, megaphone, and television combined to communicate (see Figure 2). While the reading of this image is contested, with some arguing it is a satire of Thomas Edison and his future inventions (Roberts), others have suggested that it anticipates what a video phone or a television would look like almost 100 years later (Burns). While this image could be read as one of the earliest conceptions of a video telephone, it was not until the 1920s that the first formal attempts at video conferencing began.

 “Edison’s Telephonoscope (transmits light as well as sound)”.
Figure 2: “Edison’s Telephonoscope (transmits light as well as sound)”. An illustration by George du Maurier. (Public Domain)

Video telecommunication, at the most basic level, requires sending images along with a telephone signal. By the 1920s, the cables and wires used for the telegraph as well as the telephone had been installed. Alexander Graham Bell had established the companies that would eventually become the American Telephone and Telegraph Company (AT&T) in 1884 (“The Historical Brands”). The research branch of AT&T, Bell Laboratories, would work with one of the engineers and inventors of the first mechanical television (released in 1926) to complete the first video conference call in 1927 (Hanhardt; see also McGoogan). This call, a two-way audio connection with a one-way video connection, connected Secretary of Commerce Herbert Hoover and other officials in Washington, D.C., with AT&T president Walter Gifford in New York City (Turi). A few years later, in 1931, Bell Laboratories held the first public demonstration of a two-way videophone (Guernsey). The system for this demonstration used early television equipment and a closed circuit (Turi). Due to the Great Depression, further iterations of video telephones stalled; the technology lacked efficiency and reliability, and further development required more funding (Guernsey).

On December 23, 1947, the transistor was successfully demonstrated at Bell Laboratories (Shampo et al.). The transistor became an integral part of many (if not all) modern electronic devices: it served as an amplifier for power, removing the need to rely upon vacuum tubes, as well as a switch for digital devices. Following this invention, and a few years spent figuring out how to implement it, the 1950s brought more instances of images sent over telephone lines. In 1955, the Mayor of Palo Alto called the Mayor of San Francisco using a videophone developed by Kay Lab of San Diego (“Gawkie-Talkie”). According to a Chicago Tribune article, this video phone was anticipated to take off in the 1960s, particularly in factories and hospitals. About a year later, hospitals and the U.S. defense department demonstrated interest in X-ray pictures being sent over the phone, and Bell Labs developed its first phone that transmitted pictures along with sound (“Sending X-Ray Pictures”; “Phone that transmits”). The phone, not yet introduced as the Picturephone, included a 2-inch by 3-inch screen and used a pair of ordinary telephone wires to display one picture every two seconds (“Phone that transmits”; “Scientists See Picture-Phone”; Gould).

AT&T tried to use the wiring and other infrastructure available to it (and over which it held a monopoly) for the Picturephone. To Geoffrey C. Bowker and Susan Leigh Star, “good infrastructure is hard to find” (p. 33): the easier systems are to use, the more the material systems recede into the backgrounds of our minds. On one hand, the Picturephone demonstrated this theorization of infrastructure: tens of thousands of transistors went into the creation of the picture telephone (Gertner, p. 190), while now billions of transistors (at the scale of ten nanometers and continuing to shrink) go into a single central processing unit (Gertner, p. 209). On the other hand, using a Picturephone was clunky and not intuitive enough for use in people’s homes.

Picturephone Model I: Not Your Average Payphone or Telephone Booth

A few months after the Picturephone’s appearance at the World’s Fair, Lady Bird Johnson assisted AT&T in kicking off its commercial Picturephone service by making a call from Washington, DC, to New York City. Mrs. Johnson spoke for five minutes to Dr. Elizabeth A. Wood, a scientist from Bell System’s laboratories, on a screen 4⅜ inches by 5¾ inches (“Picture Phones Go”; see Figure 3). Unlike at the World’s Fair, individuals did not have to sit completely still to be seen by the other party.

Figure 3: Lady Bird Johnson uses the Picturephone to make a call from Washington, DC, to New York City.

Commercial Picturephone service began in three cities (New York; Washington, DC; and Chicago), each in a designated Picturephone booth (“Picture Telephone Ready”). New York’s booth was located in the Grand Central Station terminal, Washington’s at the National Geographic Society Building, and Chicago’s in the Prudential Insurance Building (“Picture Telephone Ready”). In order to use the Picturephone, users had to make an appointment for a particular time. The calls had differing costs: $16 for the first three minutes between New York and Washington, $21 between Chicago and Washington, and $27 between Chicago and New York (“Picture Phones Go”; roughly $130 to $224 in 2020 dollars). Bell Labs attempted to market the booths for many different purposes, such as home buying, business communication, sales, and even demonstrating hair styling (Sloane). Some of these purposes worked for some individuals: one New Jersey couple did find and buy a home in Chicago from the Grand Central Station terminal (“N.J. Couple Selects”), while some businesses sought to limit travel (“Video Phone Held”). However, most people did not like the booth service, finding appointments burdensome and the cost excessive (Gertner). As a result, Bell opted to focus its attention on companies and corporations.
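The 2020 figures in parentheses above follow from simple inflation arithmetic; in the minimal Python sketch below, the consumer-price multiplier of roughly 8.3 for converting 1964 to 2020 dollars is an assumption made for illustration, and a different index would shift the results slightly.

# Rough 1964 -> 2020 conversion of the three-minute booth rates cited above.
CPI_MULTIPLIER_1964_TO_2020 = 8.3  # assumed, illustrative multiplier

rates_1964 = {
    "New York-Washington": 16,
    "Chicago-Washington": 21,
    "Chicago-New York": 27,
}
for route, dollars in rates_1964.items():
    print(f"{route}: ${dollars} in 1964 is roughly ${dollars * CPI_MULTIPLIER_1964_TO_2020:.0f} in 2020")
# New York-Washington: $16 in 1964 is roughly $133 in 2020
# Chicago-Washington: $21 in 1964 is roughly $174 in 2020
# Chicago-New York: $27 in 1964 is roughly $224 in 2020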

Picture This: Company Usage of Video Phones

Beginning in 1967, the Picturephone began the slow process of integration within corporations. Per John Wilford, the invention of a new compact and more durable television camera tube would help the system go into commercial trial the next year, in 1968 (p. F1). Wilford suggested uses similar to those proposed for the booth services, such as buying products or limiting business travel. He also wrote about the phone’s infrastructure: the Picturephone used the same wires as other telephones but required two extra pairs for transmitting and receiving the video signals (Wilford). The Picturephone also relied upon “digital transmission systems” which, in Wilford’s words:

“take a telephone signal, either voice or video, and turn its waveforms into electrical voltages represented by computer language [binary]. It then breaks the signal into a stream of coded electrical pulses...each [telephone] line is capable of handling several million pulses a second.” (p. F1)
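The general idea Wilford describes, turning a sampled waveform into a stream of binary pulses, can be illustrated with a minimal, generic sketch in Python; the sampling rate and 8-bit quantization below are illustrative assumptions, not a description of AT&T’s actual transmission equipment.

import math

SAMPLE_RATE = 8000   # samples per second (illustrative assumption)
BITS = 8             # bits per sample (illustrative assumption)
LEVELS = 2 ** BITS

def digitize(duration_s=0.001, freq_hz=1000):
    """Sample a sine wave, quantize each sample, and emit it as binary pulses."""
    n_samples = int(SAMPLE_RATE * duration_s)
    pulses = []
    for n in range(n_samples):
        value = math.sin(2 * math.pi * freq_hz * n / SAMPLE_RATE)  # the "analog" waveform
        level = round((value + 1) / 2 * (LEVELS - 1))              # quantize to 0..255
        pulses.append(format(level, f"0{BITS}b"))                  # code the level as bits
    return " ".join(pulses)

print(digitize())  # a short stream of coded "pulses" standing in for line signals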

Over a year and a half later, Frank Wells published another article detailing, again, how Picturephone service was “feasible” and expected to be offered commercially in early 1970 (p. 8). Wells discussed how a new L-4 coaxial cable system would be capable of carrying 32,400 telephone conversations at the same time, and he concluded that this would benefit business communication by limiting travel (p. 8).

In 1969, Westinghouse, an electric company based in Pittsburgh, PA, became one of the first companies to exhibit “how well” the AT&T Picturephone “worked” (“Westinghouse Tests New”, p. 51). Westinghouse signed a six-month contract to test 40 Picturephone sets, 29 in Pittsburgh and 11 in New York (“Westinghouse Tests New”). AT&T installed the Picturephone Mod II, which included a wider screen, 5½ inches wide by 5 inches high (“Westinghouse Tests New”), controls that allowed the user to adjust the camera height and boost the contrast of the image on the screen (“Westinghouse Tests New”), and features such as faxing and group videoconferencing (Schnaars and Wymbs). It is interesting that AT&T was trying to build faxing into these Picturephones, as the facsimile, or fax machine, was a technology that failed in certain historical moments and flourished in others (Coopersmith).

In trying to garner more business interest in the device, the May/June 1969 edition of the Bell Laboratories Record detailed how the Picturephone worked and how it could be easily integrated into professional life. In particular, secretaries (or “attendants,” per the article) would be able to use the voice-only functions of the service in communicating with their bosses, and they would serve as a type of internal operator (Harris and Williams). These jobs fell primarily to women (see Bureau of Labor Statistics: https://www.bls.gov/mlr/2006/03/art3full.pdf), yet the Picturephone was created by and marketed primarily to men.

The Pittsburgh commercial service was officially inaugurated on July 1, 1970 (“Picture-Phone Service”; Janson; “Dial a Friend’s”). Thirty-eight Picturephone sets were installed for eight companies in Pittsburgh (“Dial a Friend’s”). As reported by an article in the Chicago Tribune, the emphasis on commercial service was due to high initial costs and charges (“Dial a Friend’s”). Installation cost $150 for the first year, and each company with Picturephones installed paid $160 a month for service on the first set and $50 a month for each additional one; those costs did not include the 25¢ a minute charge if the company used the Picturephone beyond 30 minutes a month. In today’s dollars, the companies paid over $1,300 a month to use a single Picturephone. Lipartito notes that, in an early survey, prospective users were willing to pay $125 a month for the service; however, growth in both consumers and infrastructure was integral to the technology’s success. Lawrence J. Barnhorst (then vice president and general manager of the Bell Telephone Company of Pennsylvania) was quoted as saying the primary objective of the company was to “reduce the cost” of the Picturephone so it would “be readily available to everyone”; the company hoped that mass production in the 1980s would reduce the rates and make the technology more affordable (“Dial a Friend’s”). In other words, AT&T set the Picturephone price based on previous customer research as well as the infrastructural cost. With this price set, Donald Janson of The New York Times also anticipated that by 1975 over 100,000 Picturephones would be in use. Janson compared the extreme costs to the initial costs of transatlantic and transcontinental telephone calls: what originally cost $75 in 1927 had become $6 by 1964 with greater popularity and use. The Picturephone, however, was still struggling to find the market that would let its popularity rise in the first place.

With its now 5-inch by 5½-inch screen displaying a “clear and sharp” image in black and white, the Model II (Mod II) Picturephone addressed privacy concerns and opened new avenues of use (p. 1). For instance, once the Picturephone was integrated into the home, users could shop by Picturephone, visit a hospital, hold a family reunion, or attend a lecture (“Dial a Friend’s”). Multiple articles even mentioned increased accessibility for those hard of hearing, who could read lips using the Picturephone (“Dial a Friend’s”). Commercially, these phones could be used to communicate with a computer; units called “Data Sets” would make it possible for a Picturephone to display information from the computer (“Dial a Friend’s”). Janson stated that Bell Labs was working on color and three-dimensional images, as well as expanding the service to other cities beyond Pittsburgh, with Chicago next (Janson; “Dial a Friend’s”).

This technology and other networked technological structures have served as infrastructure for future projects and for aspects of technologies we use today. The transistor is one example, but we can look to the Picturephone for a few others, such as the sending of images, like X-rays, over telephone lines (“Gawkie-Talkie”). When the Picturephone was first being tested, hospitals were thought to be one of the main institutions that would use the technology, and even the defense department showed interest in transmitting images over these wires (“Sending X-Ray Pictures”). While there is a whole (separate) history of the transmission of wire photos, or images sent over telegraph and telephone lines, the Picturephone and the funds AT&T spent on adding more wiring for its installation laid the groundwork for future transmissions. On a related note, the Picturephone also gave AT&T an opportunity to further study and improve early digital signal transmission in order to send signals over longer distances. Finally, one could argue that the Picturephone, particularly through the “Data Sets,” was one of the first instances of a graphical user interface (GUI). To be clear, AT&T avoided anti-trust issues (such as explicitly moving into computational territory, which it was not permitted to do based on a 1956 anti-trust case) by using telephone wires (Lipartito). While these “Data Sets” are not recognized in the histories of the GUI, likely because of this case, I believe that by allowing individuals to more easily “communicate” with their computers, the Data Sets proved to be a feature (and infrastructure) paramount to most of our computer usage today.

The Fizzle & Failure of the Picturephone

In January 1971, Jesse Glasgow published an article detailing how businesses in Maryland were hoping to receive Picturephone service for conference calls; however, Glasgow described the Picturephone, which at this point had been commercialized for six months, as still “in its infancy” (Glasgow, p. K7). Around the one-year anniversary of the Picturephone’s commercialization in Pittsburgh, Boyce Rensberger published an article describing the Bell Corporation’s disappointment with the product. Only 16 Picturephones had been installed since 1970. Rensberger quotes Robert Sweeney, the marketing manager for Bell in Pittsburgh, as saying that “the thing hasn’t really grown the way we thought it would...it’s turned out to be an extremely successful device from an engineering design point of view, but the trouble is we just haven’t found our market” (p. 26). Although Rensberger cites the economic slump as part of the issue, he also notes that the lack of long-distance service did not help (p. 26). Rensberger also quotes Joseph C. Rengel, executive vice president for nuclear energy systems at Westinghouse, explaining that the Picturephone is “not a very personal form of communication. There’s no color. You’re gray. I’m gray” (p. 26).

Per Kenneth Lipartito, by the end of 1972 Pittsburgh had only 32 Picturephone sets in service. Chicago, the only other city to receive commercial service, peaked in 1973 at 453 users (Lipartito, p. 52). The Picturephone existed through 1974 with a few customers paying $87.50 for the service, but by 1978 the remaining devices existed only on desks at Bell Labs before quietly being removed (Lipartito, p. 52). The Picturephone had socially slipped and fiscally failed, gathering the interest of only a fraction of Bell’s users and costing the company around $500 million.

From an engineering standpoint, the technology of the Picturephone worked; it completed the job it was technically supposed to complete. However, just because something works does not mean that people will use it. People support infrastructures just as infrastructures support us and our activities. The telegraph, and even early telephone usage, had people serving as infrastructure through their positions as operators. In the case of the Picturephone, without individuals to use it (such as another person to call and use video with), it is meaningless that multiple pairs of telephone wires can carry audio and visual information. Our connections to others are what give meaning to these telecommunication technologies.

In fact, the Picturephone grappled with issues similar to those video conferencing continues to deal with today: privacy and connection overload (which, like everything else discussed here, are interconnected). While AT&T had monopolized the telegraph and telephone wires, meaning it had over 25 million people it could potentially connect through video (Gertner, p. 192), people did not necessarily want to be connected and on video in their own homes. Even today, many people feel a kind of context collapse: people on the other end of the video call are seeing a space that many consider to be their own, private space. Neither of these points even considers how our feelings about privacy are further contested in a digital information age (see articles on Zoom’s lack of end-to-end encryption and data management). In terms of connection, people also face greater fatigue in having not only to be present in a manner similar to other face-to-face conversations, but also to expend more energy trying to read social cues over video. Certain aspects of nonverbal communication might be easier to understand over video, but this form of mediated communication does not replace proximity.

A patent for one of the later iterations of the Picturephone.
Figure 4: A patent for one of the later iterations of the Picturephone.

Phoning It In: Learning from Past Mistakes

The Picturephone is considered a technological failure by AT&T (which lost around $500 million on the device; see Gertner) and others (Lipartito; Guernsey). As everyone grapples with the COVID-19 global pandemic in 2020, video telecommunication is more integral to our daily lives than ever. By drawing on AT&T’s Picturephone, we can better understand how the technical relies upon the cultural for productive adoption and infiltration into everyday life. In a broader sense, relying upon examples from the past can help us better understand what is happening in the current moment. It can also help us understand how we got to the next new thing by examining how previous, “failed” technologies serve as infrastructure for what we now take for granted.

One issue still occurring in our moment of video telecommunication technologies is inequity of access. Although this essay attempts to unpack our “connections” to infrastructure and culture in order to tease out the humanity in this historical narrative, I want to acknowledge that this “humanness” falls short. The dominant history of the Picturephone, like that of many earlier technologies, records a technology created by, marketed to, and used primarily by white men. An intersectional approach is not a strength of this narrative. While social movements, like the civil rights movement, occurred concurrently, there is a dearth of accessible records from the period. More research needs to be done on whether the 1964 World's Fair or the early Picturephone booths were segregated spaces, and further work is necessary to contextualize the period in conjunction with the creation of this technology. All of these points feed into larger issues of systemic racism, which is continually perpetuated through processes such as biased ways of inscribing history, issues with search algorithms, and more.

Ultimately, we expect to learn from the shortcomings and mistakes of the past in order to make video telecommunications as ubiquitous as it seems to most people (particularly in the Western world). However, there is still plenty of work to be done, and it begins with our connections bringing meaning to the infrastructure and culture around video telecommunication technology.


 

Bibliography

“1870s-1940s Telephone.” Imagining the Internet: A History and Forecast, https://www.elon.edu/e-web/predictions/150/1870.xhtml. Accessed 4 Aug. 2020.

Bowker, Geoffrey C., and Susan Leigh Star. (1999). Sorting Things Out: Classification and Its Consequences. MIT Press.

Burns, Russell W. (1998). Television: An International History of the Formative Years. No. 22.

Coopersmith, Jonathan. (2015). Faxed: The Rise and Fall of the Fax Machine. JHU Press.

Davis, C. G. (1969). “Getting the Picture.” Bell Laboratories Record, pp. 143–48.

“Dial a Friend’s Face! Picture Phone Service Is Begun.” (1970). Chicago Tribune, p. 1.

Dorros, Irwin. (1969) “Picturephone.” Bell Laboratories Record, pp. 137–41.

“Gawkie-Talkie Telephone Is Here at Last! It Not Only Hears You, It Sees You, Too!” (1955). Chicago Tribune, p. B7.

Gertner, Jon. (2012). The Idea Factory: Bell Labs and the Great Age of American Innovation. The Penguin Press.

Glasgow, Jesse. (1971). “C.&P. Man Sees Future In Picture Phone.” The Sun, p. K7.

Gould, John. (1956). “Picture, Please! Now You’ll Be Able to See While You Talk on the Newest Version of Mr. Bell’s Invention.” The New York Times, p. SM11.

Graham, Stephen, and Nigel Thrift. (2007). "Out of order: Understanding repair and maintenance." Theory, Culture & Society, 24.3. p. 1-25.

Guernsey, Lisa. (2000). “Cautionary Tale: The Perpetual Next Big Thing.” New York Times, p. G8.

Hanhardt, John G (1981). "The First Mechanical Television." Journal of the University Film Association 33.2. p. 33-34.

Harris, J.R. and R.D. Williams. (1969) “Video Service for Business.” Bell Laboratories Record, pp. 149–53.

Janson, Donald. (1970). “Picture-Telephone Service Is Started in Pittsburgh.” The New York Times, p. 1.

Korn, F. A., and A. E. Ritchie. (1969). “Choosing the Route.” Bell Laboratories Record, pp. 157–59.

Lipartito, Kenneth. (2003). “Picturephone and the Information Age: The Social Meaning of Failure.” Technology and Culture, vol. 44, no. 1, pp. 50–81.

Lee, John M. (1964). “Mechanical ‘Brains’, Lasers and 2-Way Picture Phones Are Shown by Industry.” The New York Times, p. 25.

MacDougall, Robert (2014). The People's Network: The Political Economy of the Telephone in the Gilded Age, University of Pennsylvania Press.

McGoogan, Cara (2016). “Who Invented the Television? How People Reacted to John Logie Baird’s Creation 90 Years Ago.” The Telegraph, https://www.telegraph.co.uk/technology/google/google-doodle/12121474/Who-invented-the-television-John-Logie-Baird-created-the-TV-in-1926.html.

Molnar, Julius P. (1969). “Picturephone Service- A New Way of Communicating.” Bell Laboratories Record, pp. 134–35.

“N.J. Couple Selects Home by ‘Picturephone.’” (1965). The New York Times, p. R6.

Parks, Lisa, and Nicole Starosielski, eds. (2015). Signal traffic: Critical studies of media infrastructures. University of Illinois Press.

“Phone That Transmits Pictures Along With Sound Is Developed.” (1956). The New York Times, p. 29.

“Picture Phones Go Into Service: Mrs. Johnson Is One of First to Use New Device.” (1964). The New York Times, p. 24.

“Picture-Phone Service.” (1970). South China Morning Post, p. 1.

“Picture Telephone Ready Next Month; Will Link 3 Cities.” (1964). The New York Times, p. 39.

Roberts, Ivy. (2017). “‘Edison’s Telephonoscope’: The Visual Telephone and the Satire of Electric Light Mania.” Early Popular Visual Culture, vol. 15, no. 1, pp. 1–25.

Schnaars, Steve, and Cliff Wymbs. (2004). “On the Persistence of Lackluster Demand—the History of the Video Telephone.” Technological Forecasting and Social Change, vol. 71, pp. 197–216, doi:10.1016/S0040-1625(02)00410-9.

“Scientists See Picture-Phone System On Way.” (1956). New Journal and Guide, p. D5.

“Sending X-Ray Pictures by Phone Tested By Radiologists Here and in Philadelphia.” (1956). The New York Times, p. 60.

Shampo, Marc A., Robert A. Kyle, and David P. Steensma. (2012). "William Shockley and the Transistor." Mayo Clinic Proceedings. Vol. 87. No. 6. Elsevier.

Sloane, Leonard. (1956). “Picturephone Helps to Sell Over Hundreds of Miles.” The New York Times, p. F1.

“Television Phone Used From Fair to California.” (1964). The New York Times, p. 31.

“The Evolution of Picturephone Service.” (1969). Bell Laboratories Record, pp. 160–61.

“The Historical Brands of AT&T.” Att.Com, https://about.att.com/innovation/ip/brands/history. Accessed 4 Aug. 2020.

Turi, Jon. (2014). “Look Who’s Talking: The Birth of the Video Phone.” Engadget, https://www.engadget.com/2014-09-07-look-whos-talking-the-birth-of-the-video-phone.html.

“Video Phone Held an Aid to Transit: Wide Use of New Instrument Could Reduce Traveling.” (1966). The New York Times, p. 86.

“Westinghouse Tests New Phone Units.” (1969). The New York Times, p. 51.

 

Malinda Dietrich (September 2020). “Cultural Networks: Infrastructural Implications of AT&T’s Picturephone.” Interfaces: Essays and Reviews in Computing and Culture Vol. 1, Charles Babbage Institute, University of Minnesota, 35-49.


About the author: Malinda Dietrich is a PhD student in the Communication department at the University of Colorado Boulder. She is interested in, most broadly, communication technologies. When not attempting to expand this project around AT&T’s Picturephone, she is also working on a project on Bill Gates’s Open Letter to Hobbyists (as a computational historiography about software development), as well as a project that attempts to define the meaning of “data” from different people’s perspectives.


 

From Telecommuting to Mobile Work: The IBM Experience, 1890s-2020

James W. Cortada, Senior Research Fellow, Charles Babbage Institute, University of Minnesota

Abstract: IBM was an early practitioner of remote working, beginning in the 1890s, but expanding this way of working in the 1980s. Customer engineers, programmers, systems engineers, salesmen, and consultants participated. Mobile work posed its own operational problems but offered benefits for improved service, productivity, and employee morale. However, its motives and practices remained controversial.

(PDF version available for download.)

 

IBM brick 2
IBM "Brick" used by field engineers to communicate, receive assignments, and to order parts.

Introduction

With the pandemic in full swing this spring, many people began working from home, or somewhere other than their office. The opulent campuses in Silicon Valley emptied, and to the shock of many IT people it was a radical change. As of March 2020, it appeared that five million U.S. employees worked from home, representing 3.6 percent of all labor. Other data suggest that up to 43 percent of office workers sometimes work from home, which sounds more realistic than 3.6 percent of all labor. IBM recently bragged that hundreds of thousands of its workers worked remotely, but then announced within a couple of months that many would be brought back into offices. IBM had made a similar announcement in 2017, after having bragged about how extensively its staff worked remotely. But IBMers have been doing this since the 1890s, before the company became C-T-R in 1911 and was renamed IBM in 1924, and long before the Internet, iPhones, or clunky laptops. So, it appears we need some history.

Providing Onsite Service

IBM has always been an ecosystem of different types of workers: headquarters office employees, programmers and engineers in research laboratories, factory workers, customer engineers (CEs, who installed and repaired equipment and software), sales personnel, and consultants. CEs (under different titles over the century) have worked since the 1890s wherever customers had tabulating equipment and, later, computers. About once a week they came to an IBM office to do their paperwork and meet with their managers. That continued right through the twentieth century. In research for this article I queried retired CEs. One who had started in the 1950s explained that his “tech support,” i.e., how he interacted with “Dispatch” and his manager, was “a phone booth.” Until the 1980s, CEs filled out “IRs” (incident reports) on cards, with information on the nature of their work, repairs, customer name, and so forth. For decades, CEs kept Dispatch informed of where they were working, received instructions on where to go next, and checked on the availability of parts, otherwise keeping these dispatch centers informed.

Then in 1983, the Field Engineering Division introduced the Data Communication System (DCS, a Motorola KDT800 terminal), better known to several tens of thousands of employees as the “Brick,” because it looked like one. Built by Motorola and operating on a radio network run by that company, it worked well and was used much like a tablet, further minimizing trips to the office, except to pick up parts and attend monthly meetings. In 1997, Motorola introduced a new generation of the machine. IRs and all manner of Dispatch and management communications were done remotely.

California Surfing on a Terminal

In 1979 a different chapter in IBM’s mobility history began when its laboratory at Santa Teresa in Silicon Valley gave five employees terminals to work at home. Productivity and morale were high, the technology functioned, and other lab employees there and at other locations began doing the same. By 1983, an estimated 2,000 employees in the U.S. worked a combination of at the office and at home. Their ability to program remotely expanded through the 1980s, facilitated by faster dial-up lines over the years and by the ability to attach modems to PCs during the second half of the decade. Simultaneously, IBM expanded its e-mail network (PROFS) worldwide, with incremental ability for employees to access e-mail from home. Anecdotal evidence suggests programmers liked working remotely. Some worked at an IBM facility but also had access to files and mainframes from home.

On critical projects, such as product development and support, technical staff found it useful to have after-hours access to mainframe-hosted files so they could troubleshoot issues at night or on weekends using PCs, initially over dial-up connections. That sped up the analysis of problems, instead of waiting to do so after driving back to a plant location, such as Boca Raton, Florida (home of PC development in the early 1980s) or the Rochester, Minnesota facility, where additional collaboration on PC and other related issues required coordination between the two sites. They built on the experiences of programmers of the 1970s at the Palo Alto, California site, who had used TTY terminals to develop APL.

But What About the Sales Force? Weren’t They Mobile?

In the 1910s, IBM had sales facilities called branch offices that looked more like retail operations, where customers came to learn about products. From the end of the 1920s through the 1990s, branch offices existed where sales staff had either a desk in “bull pens” or offices, but they spent only about a third of their time in these facilities. The rest was spent traveling from customer to customer, working out of customers’ buildings and communicating with other IBMers via telephone, and later also with their IBM-issued laptops (1990s), because in their trade face-to-face communication with customers proved crucial. They and their customers also visited IBM’s education centers and labs to learn about new products and applications. Every manufacturing site entertained its customers this way. So, salesmen had considered themselves remote workers for nearly a century.

As branch offices were dismantled in the 1990s, replaced with the hotel-style workspaces that became popular with many companies, sales staff came into IBM facilities less frequently. They came in to pick up their mail, use a conference room to host meetings with customers, and kibitz with colleagues. By then, they were interacting with their managers via telephone, rarely in person. By the end of the 1990s it was not uncommon for a salesman or consultant to say that they had not seen their manager face-to-face in six months, although communications via telephone and e-mail were frequent, at least once a week. That reality still exists today. The lower in the organization one was, the more likely one worked remotely without a desk or office to claim in an IBM building. In all eras, by the time one became a second-line manager they had an office, although even that practice declined in the early 2000s.

Big Blue Goes All Out With Mobility

In the late 1980s IBM entered a period when revenues and profits began declining sharply, forcing management to cut expenses. The Real Estate Division, which was responsible for building factories, offices, and laboratories and for renting space, was put under enormous pressure to sell off some of it, which it did. But what does one do with all those employees who used to go to large office towers in Paris, New York, Chicago, and elsewhere? The problem involved field personnel—salesmen, systems engineers, their management, and their administrative staffs. In the United States, beginning slowly and largely in the Midwest, customer-facing employees and their administrative staffs were issued PCs, and IBM began paying for their slow dial-up lines. Some of these lines were also set up in customer facilities. One employee working with Kodak in 1994 recalled, “When I started out working from home IBM and TSS (a Kodak-IBM joint venture) offered us office furniture, printers, computers, paper and paid for by the company along with reimbursements for Internet access and voice telephone home office charges.” Later, the company was less generous. Salesmen, systems engineers, and consultants began working remotely in the early 1990s, and by the end of the decade tens of thousands did, rarely coming into an IBM office.

It worked again. People’s commutes declined; time once spent driving to work was now often devoted to doing IBM’s. One employee recalled that everyone’s productivity “went through the roof.” Work/life balance improved, while managers who initially thought less work would be done learned that the opposite was true. Prior to going remote, various surveys of how people spent their time at work reported that a third was consumed by internal meetings and paperwork. Meetings converted to conference calls, which all had to be scheduled, so their number seemed to decline, while the remaining paperwork was sped up with email, automation, and online processing. IBM was able to discard over $2 billion in property, some 58 million square feet.

But there was an ugly side to the exercise. One employee involved in some of its earliest iterations explained: “As someone who was involved in IBM’s Mobility program when it was begun around 1993, the entire focus was to equip customer facing folks with the tools, laptops, cell phones, etc. to work from customer, office and home as needed, but it was never intended as a full-time work-at-home program. But when Real Estate found out how much savings $$ they could [gain] by ending leases and selling buildings they forced the issue.” He added, “after IBM booked all those Real Estate savings, they realized they had created something that disconnected IBMers from the Company, community and teams, and they tried to rectify that, but it wasn’t very successful.” Some employees, however, reported no loss of loyalty or connection to IBM. Many found working remotely a positive experience; as one put it, “It was good for me professionally as well as for my family.”

New Era, Consultants and More Mobility

When IBM entered the consulting business in the 1980s, and then in 2002 acquired 30,000 employees from PriceWaterhouseCoopers (PwC), the field force was well over 200,000. All were armed with laptops loaded with consulting and sales tools, other software tools, access to the Internet and to IBM’s myriad internal databases, and email. (My laptop from the 1990s above.) By the end of the 1990s, there were tens of thousands of employees who rarely—if ever—entered an IBM office, other than during the first couple of weeks of initial employee onboarding activities. By 2009 IBM reported that “40 percent of IBM’s some 386,000 employees in 173 countries had no office at all.”

IBM learned much during its first three decades of the modern era of working remotely. Employees put in as many hours (or more) as they had in offices and plants, often diverting time once spent commuting to work; hence the company got more hours from its employees without IBMers feeling overworked. They worked in more optimal ways: morning people by 9:30 had been “at work” for three hours; other employees were night owls and so were working at midnight. Most loved the 20-second commute from their kitchens to their “IBM office.” They could attend their children’s school events and be productive because they had more control over their calendars, their time. It worked. IBM lost some of the spontaneous, serendipitous opportunities to convene impromptu meetings to bat around new ideas, although that could still take place; it just had to be scheduled more than before. One had to be proactive, and also make room for imagination.

Learning Lessons

Management had much more to learn, however. People needed job descriptions, incentives, and appraisals that valued self-reliance and personal commitment to their jobs. So, hiring self-starters proved crucial, not simply bright people who needed to be told what to do. Those who did not fit the profile disliked this mode of work, often feeling isolated, and so drifted away. Management let it happen. Most employees adapted to the new way of working. Consultants from PwC came from a culture of working remotely at client locations. Everyone needed digital plumbing: high-speed internet access paid for by IBM (not employees), powerful laptops, and state-of-the-art smart phones. The company learned to budget for these, and whenever some manager did not, employees rightfully complained, or quit. It had to be a commitment made at the top of the firm, a lesson the company partially ignored in the 2010s when it went through another round of challenging business times.

Managers had to host online events to maintain community. I used to hold virtual staff meetings at a time agreed to by my teams, which I insisted we would hold “whether we needed to or not.” Every one of those staff meetings was packed with things to talk about. My colleagues in management had the same experience. When online face-to-face meetings became available in the 2000s, these became more relevant and impactful, and employees looked forward to them. We are, after all, social creatures. It proved important to use such events to maintain cohesion, group spirit, and teaming.

In 2012 IBM entered a new period in which revenues declined every quarter, with the exception of one, until 2020. Profits shrank, too, in the post-2017 period. To protect its balance sheet IBM went through relentless rounds of reducing operating costs. These involved attempting to sell off less profitable lines of business, such as PCs and DASD, disposing of buildings (largely factories), and continuously laying off employees. An estimated 100,000 have been laid off or otherwise pushed out of the company since 2010/2012. Nowhere was this more evident than among the ranks of the CEs, salesmen, systems engineers (now called IT architects), consultants, and administrative staffs. Factory workers were disposed of through sales of factories, bodies and buildings together.

Another form of remote work that began in the 1990s and expanded in the 2000s receives little attention in discussions about mobile work: the wholesale movement of tasks from one country, or division, to another, famously the Indian help desks at Microsoft, IBM, and other companies. Labor costs were lower in India—often 80 percent lower than in more “advanced” economies—and telecommunications were available and cheap. So, product development and customer and employee support services moved incrementally out of the U.S. and Europe to India. IBM India went from several thousand employees in the early 2000s to a rumored (never officially announced) 150,000 within a decade, and now closer to 100,000. Centers of Excellence, while still physical locations as they had been since the 1980s, interacted with customers and other IBM divisions more remotely over time. IBM was able to reduce its workforce in the U.S. while still servicing its American markets, although American IBMers and customers often complained about eroding quality of support. Several tens of thousands of American workers were displaced between the late 1990s and the 2010s through this process. At IBM India, however, employees worked in cubicles on IBM campuses.

New Times New Issues

An important component of IBM’s workforce “rebalancing,” announced in 2017, involved forcing some groups of workers, initially marketing and communications staffs, to stop working remotely. They were given the option of reporting to work at offices, often in other cities, which would require them to move their homes and families, normally at their own expense. If they chose not to do so, often within 30 to 60 days, they were considered to have “resigned” from the company. The strategy was implemented in waves between 2017 and the present, with the result that many employees in their 40s and 50s left IBM rather than upset their personal lives. One press report echoed what many IBMers were saying on their websites: that this move back to the office was “a veiled method of shedding workers” (Vassel, 2020). IBM said it was to improve productivity and collaboration for improved innovation. The media reported that various studies essentially demonstrated that letting employees work wherever suited them improved performance and productivity. In 2017, IBM selected some 5,000 workers for a return to offices. The optics remained ugly.

Then the pandemic in 2020 led some 350,000 IBM employees to work remotely around the world. Many already had the necessary technical infrastructure and work culture to do so. But by late spring it began to appear that perhaps some could come back to work, so the earlier initiative of consolidating work was renewed, while at the same time the firm continued its decade-long process of laying off employees, now many of whom worked remotely. Objections from employees and the recently laid off had been intense since at least 2018, so this was a familiar refrain. One programmer commented, “Yep, IBM’s ‘Back to the Labs’ mandate convinced me to retire in 2018. So their ‘real’ program worked in my case.” Another in 2020 opined that, “this struck me as a disguised layoff targeting people in their 40s and 50s. Also hit people in their 30s.” Another framed the issue differently: “Let’s be honest. IBM had too much real estate it could not sell so it forced employees back to the office and lost thousands of outstanding professionals.” Others defended remote work: “I worked from home for IBM my last 18 years. My productivity was through the roof. They forced us back into open area office space my last 2 years,” and as a result, “tanked moral and IBM has lost a lot of good people.”

While employee suspicions were probably more true than not, there was also growing evidence that people working in proximity to one another did improve creativity—important as innovation in AI and other forms of IT was needed. But IBM’s senior management had lost too much credibility with the press and with the many employees opining on websites. These decisions to consolidate back to offices were made largely during the tenure of CEO Virginia Rometty.

IBM was not alone in its long journey through remote work. Just as companies took notice in 2017, and again in 2020, when IBM reversed its remote working models, other companies and government agencies tried to learn more. But some also had a history of working remotely. Office and “high tech” staffs began exploring the possibilities of remote work largely in the 1970s. One voice that solidified much of the early thinking about why and how to do this was Jack Nilles, known widely as the “father of telecommuting,” who spent the bulk of his career as an engineer on aircraft and rocket projects and, in the 1970s, as Director for Interdisciplinary Research at the University of Southern California. From the 1980s on he consulted about telecommuting, authoring books and articles along the way. His publications explained the concept and how to implement it. He helped spread the word among large American corporations about the benefits of remote work, often referencing other initiatives, including IBM’s.

When IBM announced again in June 2020—as the pandemic was spreading faster than in the previous two months—that it was retreating from remote work, press coverage expressed surprise, just as other companies were debating whether to extend remote work beyond Covid-19. Employees and recently retired IBMers complained again, and the press expressed puzzlement. IBM felt compelled to trot out its chief medical officer, Dr. Lydia Campbell, to explain. CNN quoted her: “I think we realize at IBM and what most large employers realize is that this pandemic is going to make us all think about new ways of working and different approaches to work” (Vassel, 2020). By the end of the summer IT workers all over the world were complaining, often refusing to work at any company campus. Management in such firms had no choice but to let their employees work remotely.

Other companies were moving ahead, experimenting, since it was a new way of working for them. The Society for Human Resource Management announced that two-thirds of American companies were working remotely for the first time. CNN reported that Silicon Valley was notorious for resisting attempts to allow employees to work remotely, and so had little experience, certainly not to the extent IBM had. One compelling reason for the change that struck Silicon Valley workers had dawned on IBMers as early as the 1990s: “It makes no sense paying Bay Area rent if we can earn our salary living elsewhere.” Members of the IT industry were moving in that direction, despite IBM’s opposite march. That they were not following IBM’s lead was an exception to decades of seeing Big Blue as innovative and progressive.

History’s Insights

Historians can draw several lessons from this story. The history of mobile working is fraught with complexity in its implementation, its impact on productivity and employee morale, the role of technology, and company culture. Each of these categories of consequences had varied supporters and critics. Noble and sinister motives were both apparent and hidden. How management worked changed, as did the careers and political power of employees, in myriad ways. It is a history that we know very little about, but one that goes to the heart of how organizations functioned in the last third of the twentieth century and probably will for years to come. It warrants the attention of sociologists, business management scholars, and historians.


Bibliography

Cortada, James W. (2019). IBM: The Rise and Fall and Reinvention of a Global Icon, MIT Press.

Global Workplace Analytics. (2020). Latest Work-At-Home/Telecommuting/Mobile Work/Remote Work Statistics.

Messenger, Jon C. (2019). Telework in the 21st Century: An Evolutionary Perspective, Edward Elgar.

Meyers, Glenn E. (1999). IBM Field Engineering Experiences: A Personal Memoir, IEEE Annals of the History of Computing, 21, no. 4, 72-76.

Mullen, Regina. (June 2, 2017).  IBM Shutters Remote Work: Should You Too?, Replicon blog.

Nilles, Jack M. (1994). Making Telecommuting Happen: A Guide for Telemanagers and Telecommuters, Van Nostrand Reinhold.

__________ (1998). Managing Telework: Strategies for Managing the Virtual Workforce, Wiley.

__________ (2007). The Telecommunications-Transportation Tradeoff: Options for Tomorrow, BookSurge.

Pardes, Arielle. (May 15, 2020). Silicon Valley Rethinks the (Home) Office, Business.

Sak, John C. (2018). The Computer Guy Is Here!: Mainframe Mechanic, Self Published.

Streitfeld, David. (June 29, 2020). “The Long, Unhappy History of Working From Home,” New York Times.

Useem, Jerry. (November 2017). When Working From Home Doesn’t Work: IBM Pioneered Telecommuting. Now It Wants People Back in the Office, The Atlantic.

Vassel, Kathryn. (June 25, 2020). IBM’s Chief Medical Officer: We Won’t Rush to Bring People Back, CNN.

 

[Note: Quoted material from the author's recent survey of IBMers on Facebook.] 

James W. Cortada, (August 2020) “From Telecommuting to Mobile Work: The IBM Experience, 1890s-2020.” Interfaces: Essays and Reviews in Computing and Culture Vol. 1, Charles Babbage Institute, University of Minnesota, 23-34.


About the author: James W. Cortada is a Senior Research Fellow at the Charles Babbage Institute, University of Minnesota—Twin Cities. He conducts research on the history of information and computing in business. He is the author of IBM: The Rise and Fall and Reinvention of a Global Icon (MIT Press, 2019). He is currently conducting research on the role of information ecosystems and infrastructures.


 

Charles Babbage’s Ninth Bridgewater Treatise

Margaret Dykens, MLIS, MS, Curator and Director of the Research Library San Diego Natural History Museum

With preface by Amanda Wick, Interim Archivist, Charles Babbage Institute Archives

(PDF version available for download.)

 

Who was Charles Babbage?

Charles Babbage, Victorian scientist and mathematician, was born on December 26, 1791 to a family of London bankers. Fascinated with mathematics, and especially algebra, he studied the subject at Trinity College, Cambridge. While attending Cambridge, he co-founded the Analytical Society for promoting continental mathematics and reforming traditional teaching methodologies of the time. Many of these methods are still used in some form today in the instruction of algebra.

Following completion of his degree, Babbage worked as a mathematician for the insurance industry. He was elected a Fellow of the Royal Society in 1816 and played a prominent part in the foundation of the Astronomical Society (later Royal Astronomical Society) in 1820. As a member of the Royal Society during the heady days of the early 1800s, Babbage came into contact with a number of great thinkers and engaged in a robust correspondence with fellow mathematicians, naturalists, and philosophers—including Sir John Herschel, Charles Darwin, and Ada Lovelace.

In 1821 Babbage began the first of his two calculating machines, the Difference Engine, which would quickly become his singular passion and focus. The Difference Engine was intended to compile mathematical tables and, after completing a working portion of it in 1832, he began work on a more complex and multifunctional machine that could perform any kind of calculation. This was the Analytical Engine, and its design is widely considered to be the founding of the field of modern computing.

Today, little remains of Babbage's prototype computing machines; unfortunately, the critical tolerances required by his machines exceeded the level of technology available at the time. Though Babbage’s work was formally recognized by respected scientific institutions, the British government suspended funding for his Difference Engine in 1832, and after an agonizing waiting period, ended the project in 1842. Though Babbage's work was continued by his son, Henry Prevost Babbage, after his death in 1871, the Analytical Engine was never successfully completed, and ran only a few "programs" with embarrassingly obvious errors.

Despite his many achievements in mathematics, scientific philosophy, and his leadership in contemporary social movements, Babbage’s failure to construct his calculating machines left him a disappointed and embittered man. He died at his home in London on October 18, 1871.

What’s in a name?

The calculating engines of English mathematician Charles Babbage (1791-1871) are among the most celebrated icons in the prehistory of computing. Babbage’s Difference Engine No. 1 was the first successful automatic calculator and remains one of the finest examples of precision engineering of the time. Babbage is sometimes referred to as "father of computing." The International Charles Babbage Society (later the Charles Babbage Institute) took his name to honor his intellectual contributions and their relation to modern computers.

Where is Babbage in the Archives?

Materials related to Charles Babbage are scattered around the world, with the vast majority of his personal papers and library held at the Science Museum, London, and the British Library. Although the Charles Babbage Institute is named after Charles Babbage, we actually have very little material originating with our namesake. What we do have are first editions of many of his books and journal articles, a number of which are inscribed with dedications to his patrons by the author. These rare materials constitute the earliest materials in our repository and, while used in classroom settings and on exhibit, rarely leave our vault. Our holdings of Babbage’s work include the following:

  • Babbage, Charles. On a Method of Expressing by Signs the Action of Machinery. London: [Royal Society of London], 1826.
  • Babbage, Charles. Reflections on the Decline of Science in England, and on Some of Its Causes. London: Printed for B. Fellowes (Ludgate Street); and J. Booth (Duke Street, Portland Place), 1830.
  • Babbage, Charles; Flather, John Joseph; and Dodgson, Charles. On the Economy of Machinery and Manufactures. London: C. Knight, 1832.
  • Babbage, Charles. Passages from the Life of a Philosopher. London: Longman, Green, Longman, Roberts, & Green, 1864.
  • Babbage, Charles. The Ninth Bridgewater Treatise: A Fragment. Second Edition., Reprinted. ed. Cass Library of Science Classics; No. 6. London: Cass, 1967.

What is the Ninth Bridgewater Treatise?

One of the titles in Babbage’s oeuvre that is uniquely significant is the Ninth Bridgewater Treatise. This volume presents Babbage’s perspective on the eight Bridgewater Treatises—a series of works by multiple influential thinkers of the Victorian era on natural history, philosophy, and theology. Babbage’s contribution was not officially affiliated with the eight-volume series and was merely his own considerations on the topic. In his volume, which he titled the Ninth Bridgewater Treatise, he discusses his calculating machines and posits the idea of God as a divine programmer who established the rigid natural laws that govern humanity and civilization; in many ways it presents a case for deus ex machina.

As a fragmentary piece, and one that does not dwell on mathematical or scientific subjects, it is a rarity amongst Babbage materials. Our copy is a second edition and, while in excellent condition, it is not especially rare. Recently, the Curator and Director of the Research Library at the San Diego Natural History Museum, Margaret Dykens, experienced one of those once-in-a-lifetime finds when she reviewed an anomaly within their catalog: an edition of Babbage’s Ninth Bridgewater Treatise that appeared to be a galley proof. As she notes in the following article, close examination of the item by both herself and the noted Babbage scholar Dr. Doron Swade yielded several incredible finds.


Bibliography

Charles Babbage Institute. (10 June 2020). “About Charles Babbage.” Charles Babbage Institute web site. http://www.cbi.umn.edu/about/babbage.html.

Swade, Doron. (12 June 2020). "Babbage, Charles (1791–1871), mathematician and computer pioneer." Oxford Dictionary of National Biography. 23 September 2004. https://www.oxforddnb.com/view/10.1093/ref:odnb/9780198614128.001.0001/odnb-9780198614128-e-962.

 

Amanda Wick (July 2020). “Charles Babbage’s Ninth Bridgewater Treatise.” Interfaces: Essays and Reviews in Computing and Culture Vol. 1, Charles Babbage Institute, University of Minnesota, 17-22.


About the author: Amanda Wick is the interim archivist at the Charles Babbage Institute Archives (CBIA) at the University of Minnesota. Prior to working at CBIA, Amanda led major processing projects at the University of Minnesota and managed the archives of the Theatre Historical Society. She obtained her Bachelor’s degree in Environmental Studies from Lawrence University (Appleton, WI) and her Master's in Library and Information Science from Dominican University (River Forest, IL).

 

Charles Babbage’s Ninth Bridgewater Treatise in the SDNHM Library

Margaret Dykens, MLIS, MS, Curator and Director of the Research Library San Diego Natural History Museum

Abstract: As a foundational figure in the history of science, Charles Babbage is best known for his contributions to computing. In fact, his mechanical, programmable calculating machines are considered precursors to modern computers. These accomplishments were the primary reason for the naming of the Charles Babbage Institute, and its archivists have sought to honor its namesake through the purchase of rare books authored and inscribed by him. One such book is a fragmentary oddity, the Ninth Bridgewater Treatise, and a copy owned by the San Diego Natural History Museum that was recently examined by curatorial staff and prominent Babbage scholar, Dr. Doron Swade, holds curious clues to Babbage's approach to natural philosophy. (KW: Babbage, Charles; Swade, Doron; computing history; rare books; antiquities; archives.)

 

Image of Treatise with hand-written pencil annotations

The Research Library of the San Diego Natural History Museum (SDNHM), founded in 1874, has extensive holdings of rare and antiquarian books, including natural history volumes dating back to 1514. The majority of these books were donated by various naturalists and philanthropists over the past one hundred years. One such naturalist was General Anthony Wayne Vogdes (1843-1923), a career Army officer with an active secondary career as a geologist and paleontologist. Vogdes was also an avid bibliophile and donated his extensive scientific library to the SDNHM after his death in 1923. One of the books from Vogdes’ library was a first edition of Charles Babbage’s Ninth Bridgewater Treatise (1837).

This particular volume was mentioned in a newspaper article published on January 11, 1896 in the San Francisco Bulletin, which described many of the most important books in Vogdes’ personal library. Babbage’s Ninth Bridgewater Treatise is mentioned in the list with the comment that it contained “annotations by the author.” The book in question appears to be a galley proof with wide margins and many hand-written pencil annotations, as well as marginalia likely written by the author.

hand-written letter bound into the book

There is also a portion of a hand-written letter bound into the book itself—Vogdes was an amateur book-binder and his library consists almost exclusively of his own bindings, many of which have notes, letters, images, or other memorabilia that he collected and bound into the text.

I was intrigued by the hand-written annotations and marginalia in Vogdes’ copy of the Ninth Bridgewater Treatise and contacted Dr. Doron Swade, preeminent Babbage scholar and retired curator of the Charles Babbage collection at the Science Museum of London, for verification of the handwriting. After I emailed Dr. Swade several images of the annotations, he replied that it was highly likely they were in Charles Babbage’s own hand, both because of the style of writing and because of the content itself. To quote Dr. Swade:

Having gone through the 7,000 manuscript sheet (ms) of Babbage Scribbling Books the handwriting in what is visible on the folded manuscripts interleaved on page 128, and in the third image, looks very much like Babbage’s, as do the pencilled annotations.

But there is stronger evidence for the annotations and ms being his: in the preface ‘advertisement’ to the second edition Babbage states that the chapter ‘On Hume’s Argument Against Miracles’ has been ‘nearly rewritten’. The first image you sent with the pencilled annotations, which are surely from the first edition, correspond to changes made in the second edition. It is not credible that anyone other than Babbage would have made what are essentially editorial instructions, and editorial amendments, that were carried through to the second edition.

There is even more conclusive evidence in the sample page 131 where the pencilled annotations appear verbatim in the second edition, and the several pencilled deletions have also been carried through.

The ms in the third of the images you sent starts with the same opening sentence that appears in the second edition at the top of page 127 though what follows has been edited and amended. It could be that this is a sheet from the original manuscript for the first edition though not having access to a first edition I am unable to confirm this.

It is fair to conclude that the annotations are Babbage’s. It is difficult to see any other explanation.

Image of Treatise

Although I do not know how General Vogdes came to have this particular annotated first edition of the Ninth Bridgewater Treatise in his collection, I am not surprised as his entire library constituted over seven thousand scientific volumes on topics related to geology, paleontology, and other scientific and philosophical disciplines. Indeed, his personal library included works by Darwin, Hume, Dana, Agassiz, and Lyell as well as many other well-known natural historians and intellectuals.

We are hopeful that this unique source might be of interest to some Babbage researcher or historian. Any scholars interested in pursuing this topic further should feel free to contact me directly at the SDNHM Research Library.


Bibliography

Swade, Doron, Dr. “’Ninth Bridgewater Treatise.’ Message requesting assistance in authenticating possible rare volume by Charles Babbage.” Message to Margaret N. Dykens. January 2020. E-mail.

 

Margaret N. Dykens (July 2020). “Charles Babbage’s Ninth Bridgewater Treatise in the SDNHM Library.” Interfaces: Essays and Reviews in Computing and Culture Vol. 1, Charles Babbage Institute, University of Minnesota, 17-22.


About the author: Margaret N. Dykens received her Master’s degree in Biology from the College of William and Mary, Williamsburg, Virginia in 1980. Upon graduation, she was hired as Staff Illustrator at the Harvard University Herbarium. Margaret went on to earn a second graduate degree in Library Science from the University of Michigan School of Information in 1993. In 1997, she became the Director of the Research Library for the San Diego Natural History Museum (SDNHM). In addition to her work directing the SDNHM Research Library, she has served as curator for two exhibitions. The first was The California Legacy of A.R. Valentien, based on the Museum’s fine art collection, which she toured to numerous venues across the U.S. In 2016, she also curated the permanent exhibition, Extraordinary Ideas from Ordinary People: A History of Citizen Science, based on fine art works, historical objects, and rare books from the Research Library.


 

Of Bugs, Languages and Business Models: A History

Alejandro Ramirez, PhD, Sprott School of Business – Carleton University

Abstract: A series of wrong decisions precipitated the Y2K crisis: adopting the 6-digit date format, using COBOL as the standard in business computing, and discontinuing COBOL teaching in many American universities shortly after it was adopted. Did we learn anything from this crisis? (KW: Y2K crisis, COBOL, Internet history, Outsourcing.)

(PDF version available for download.)

 

Y2K Time magazine cover
TIME magazine addressing social misunderstanding (Courtesy of the Charles Babbage Institute Archives)

Introduction

Twenty years ago, we averted the Y2K crisis. When we talk about the crisis, people are genuinely puzzled that it was a very expensive affair. They have a distorted idea about a crisis that did not happen, or about how it was supposed to be the end of the world when, in the end, nothing actually happened. Then they wonder if something similar could happen again. That is really the crux of the matter: what did we learn from the Y2K crisis?

Knowing the history of this crisis is an important and serious endeavour. It helps us understand how computer usage evolved, and what forces shaped our technology, our practices, and computers’ contribution to society. History becomes an indispensable light guiding us in this understanding.

What were they thinking?

Employment of personnel to use computers in businesses became widespread in North America with the introduction of the IBM 1401 in 1959. Before then, machine-based data processing, where it existed, was generally executed by electromechanical accounting machines. Calendar dates, if needed, were fed in via punched cards indicating the date appropriate for that job. When programmers from the late 1950s to the mid-1960s decided, in order to save on memory costs (McCallum 2019), to use only the last two digits of the year, i.e., 60 instead of 1960, they never imagined that their programs would still be running at the end of the 20th century. After all, 40 years seemed a very long time, especially since they were saving approximately $16.00 USD per date by saving two bytes, 16 bits, of core memory valued at about one dollar per bit.
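To make the bug concrete, consider the minimal sketch below (written in Python purely for illustration; the period programs in question were typically COBOL or assembler, and the function name and sample values here are hypothetical). It shows how arithmetic on two-digit years breaks once the century rolls over.

    # Illustrative sketch only: legacy systems stored years as two digits ("YY")
    # to save two bytes of storage per date. Simple arithmetic on such years
    # works until the century boundary, then silently produces nonsense.

    def years_elapsed(start_yy: int, end_yy: int) -> int:
        """Naive two-digit-year subtraction, as many legacy programs did it."""
        return end_yy - start_yy

    # An account opened in 1975 ("75") and checked in 1999 ("99"): correct.
    print(years_elapsed(75, 99))  # 24

    # The same account checked in 2000 ("00"): the result goes negative.
    print(years_elapsed(75, 0))   # -75 instead of 25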

When IBM announced its new, more powerful System/360, with many innovative features compared to its 1400-series technology, it also decided—in the interest of compatibility—that the system’s date would also be a 6-digit date. To cement this practice, on November 1, 1968, the U.S. Department of Commerce, National Bureau of Standards, issued a Federal Information Processing Standard which specified the use of 6-digit dates for all information exchange among federal agencies (FIPS 1968). The standard became effective on January 1, 1970, enshrining the 6-digit date in government bureaucracy, again with little to no thought of the year 2000.

It took about fifteen years for someone to realize that the 6-digit date might be a problem. Unknown to all but a few programmers, Jerome and Marylin Murray published their call to arms, Computers in Crisis: How to Avert the Coming Worldwide Computer Systems Collapse, in 1984. They credited their daughter Rosanne, a senior research analyst at Systemhouse, Ltd., of Ottawa, for the origins of the book: “This book may not have been undertaken were it not for a lengthy telephone discussion of the dating problem with Rosanne…Her interest and encouragement have been unflagging” (Murray & Murray 1984, p. xix).

Shortly after the book was published, Spencer Bolles posted, on January 18, 1985, from his computer at Reed College in Oregon, the first recorded mention of the Year 2000 problem on a Usenet group: “I have a friend that raised an interesting question that I immediately tried to prove wrong.  He is a programmer and has this notion that when we reach the year 2000, computers will not accept the new date” (Bolles 1985).

Millennium bug guide: Few were able to distinguish facts from fiction (Courtesy of Charles Babbage Institute Archives)

Both Spencer L. Bolles’s unnamed friend and Rosanne Murray seem to be among the first to worry about this issue. They are the ones people do not know about, and perhaps, with them, many others. But the problem gained prominence once David Eddy called it “Y2K” (Rose 1999). Before Eddy’s acronym, the problem was referred to as the Century Date Change (CDC), Faulty Date Logic (FADL), or the Millennium Bug. For Eddy, “Y2K just came off my fingertips,” explains Rose (1999).

Reading Murray and Murray (1984), it becomes apparent that, regarding the state of computer resources in business, not much has changed. The following paragraph could appear in today’s stories about the use of computers in organizations: “All this explodes in the midst of a world economy totally dependent upon computer resources for its survival and demanding of the services of skilled technical personnel whose availability is in dreadfully short supply” (Murray & Murray 1984, p. 221). In 1984 the comment was more about the skills needed to debug all those programs used by organizations using 6-digit dates. No one knew exactly how many programs needed fixing. The task was clear: “Our fault is in our readiness to ‘patch’ or treat symptoms until it is often too late to successfully eradicate the disease” (Murray & Murray 1984, p. 334). Its magnitude was not.

Cost savings in core memory and a 6-digit date standard imposed by government are pieces of evidence in the story of why the Y2K bug was so problematic. To understand its ramifications completely, it is necessary to talk briefly about how the Common Business Oriented Language (COBOL) became the lingua franca of business programming.

In answering the question many of us have asked regarding what programmers were thinking when they created their programs, it is important to remember that one works within the reality we create and within the space of actions allowed in a given domain (Heidegger 1993). Programmers had to comply with the standards of the day when writing their code. The 6-digit date was one of them.

Washington, we have a problem

At the beginning of the second half of the twentieth century, the computer started to move away from being only for science and engineering as more large corporations and government organizations adopted computers, which became “the most vital tool of management introduced in this decade” (Lohr 2001, p. 44). It was entering accounting, payroll, logistics, manufacturing, and purchasing. But at that time programs were still a foreign language to managers, and the need for a language suited to management and business problems became essential.

As its name indicates, COBOL was a Common Business Oriented Language. It had every reason to be rejected: it was a language designed by a committee, it was not intellectual enough to entice computer scientists, and it was created for practical reasons and agreed upon by computer vendors. Could there be a common language that would have the business community working together? That was the premise of an April 1959 meeting at the University of Pennsylvania computer centre.

The same year, in late May, the Pentagon hosted a two-day meeting attended by most of the computer makers and some heavy business computer users, a group of about 40 people. From this and subsequent meetings, a ‘short-term’ committee of six was formed. Several names for the language were discussed, among them Busy (Business System), Infosyl (Information System Language), Datasyl (Data System Language), and Cocosyl (Common Computer Systems Language). According to Grace Hopper, it was Robert Bemer who proposed the name Common Business Oriented Language (COBOL). The committee delivered “the business FORTRAN”; it was now up to vendors to adopt it and to deliver their machines to organizations COBOL-ready.

COBOL, the language of business applications (Public Domain)

 

Computer vendors worldwide decided to use COBOL as the language for their business clients. And without questioning it, businesses worldwide started to use COBOL as the language for developing any business solution. The language was simple enough that programmers were able to learn it quickly and join in the effort, and suddenly the business world was running on COBOL.

It is important to note that computer scientists were not particularly interested in a language created by a committee, but they saw the need to teach it in the curriculum for a few years, until more interesting academic languages were designed and attracted their attention: C, Algol, Pascal, C++. Slowly, COBOL courses started to dwindle in many universities, to the point that by the mid-1980s there were not many computer science students who knew how to program in COBOL. COBOL was the business FORTRAN, a huge achievement, especially in a field where users care more about their discipline (business) than about computers. It became clear that computers were useful tools, but tools, nonetheless.

The Achilles’ heel of COBOL was not its syntax but its heavy reliance on the widely adopted 6-digit date standard to manipulate time-based business data. Once it became evident that many COBOL programs were vulnerable to the Y2K bug, the call for action became louder. In Murray and Murray’s own words: “This is a true computer crisis—it is application and nationality independent. It is worldwide. It has but one certain remedy—immediate action to terminate 6-digit date involvement in current system development and the scheduling of the ultimate conversion of existing systems” (Murray & Murray 1984, p. 334).

But could this call to arms be enough to mobilize companies all over the world to find a way to solve the problem? For that, its costs had to be estimated. The bottom line was simple: which is more expensive, the problem or its cure?

As an example, Lyons (1981) describes one Fortune 500 company, based in Chicago, that had a library of 50,000 COBOL programs. Each program averaged 750 lines, giving an estimated 37,500,000 lines of code in that library. A hypothetical programmer able to debug 100 lines of code per day would be left with 42.8 years of work, laboring every single day, seven days a week. To finish the job in one year, you would need to find another 42 equally efficient programmers and coordinate them! This, for one library, in one company! It was immediately clear that the job was momentous.

Suddenly the need for COBOL talent was clear, but there were not many new computer science students in the pipeline who knew COBOL. Most COBOL programmers were working in the field performing their daily tasks; they could not be asked to stop working and start debugging. The few COBOL professors left in a handful of universities overnight received job offers to head and manage COBOL teams tasked with debugging programs. It became impossible for universities to keep that underused talent in house, especially considering the salaries offered. Organizations that did not want to enter the salary bidding wars for COBOL talent decided to develop their own. Three years before the year 2000, according to a CIO quoted by Callaway (1997), “A good mainframe programmer today is worth his [or her] weight in gold.”

The Y2K bug was a costly undertaking, but it was vital to tackle it. Arranga and Price (2000), using a pop-culture metaphor, indicate the importance of the Internet and the emerging World Wide Web in solving the Y2K problem: “Perhaps the most important Y2K side effect provided Cobol with something it had always lacked: a broad-based community of developers focused on providing Cobol with options. With too much code to be repaired manually, a market of software remediation tools came of age. The tools did not turn into pumpkins at the stroke of midnight. Instead they kicked off the other shoe, took off the gown, put on a Spidergirl outfit, and swung onto the Web” (p. 18).
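Many of those remediation tools relied on techniques such as date “windowing,” which reinterprets two-digit years against a pivot year rather than rewriting every stored record. The article does not describe the technique, so what follows is only a minimal sketch of the idea, in Python rather than the COBOL such tools actually patched; the pivot value is an assumption.

    # Sketch of "windowing," a common Y2K remediation technique: map a stored
    # two-digit year to a four-digit year using a pivot, so data files and
    # record layouts do not have to change.

    PIVOT = 50  # assumed window: 00-49 -> 2000s, 50-99 -> 1900s

    def expand_year(yy: int, pivot: int = PIVOT) -> int:
        """Expand a two-digit year to four digits using a fixed window."""
        return 2000 + yy if yy < pivot else 1900 + yy

    print(expand_year(75))  # 1975
    print(expand_year(3))   # 2003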

Outsourcing: Solution or Problem? (This Photo by Unknown Author is licensed under CC BY-NC-ND)

The Internet, Y2K, and Outsourcing to India

Internet adoption all over the world became the norm, and India was among the biggest adopters. Thomas Friedman, in his 2005 best seller The World is Flat, includes outsourcing as the fifth among the ten ‘flatteners’ of his model explaining how the modern world became flat. In the story Friedman tells about outsourcing, the principal character is none other than the Y2K bug. According to him, India suddenly became linked to North America through fiber-optic cable just as the Y2K bug became an urgent reality at one end of the link; India had a surplus of COBOL programmers at the other. “And so with Y2K bearing down on us, America and India starting dating, and that relationship became a huge flattener, because it demonstrated to so many different businesses that the combination of the PC, the Internet, and fiber-optic cable had created the possibility of a whole new form of collaboration and horizontal value creation: outsourcing” (Friedman 2005, p. 108).

If the Y2K bug was a big problem for the West, it became a big opportunity for India. Suddenly every computer running COBOL programs in the West needed to be reviewed. The enormous task of checking programs line by line required an equally enormous number of qualified people, and these people were available in India. This gave Indian IT companies the opportunity to work side by side with the largest Western corporations. Before Y2K, India was producing many IT professionals hoping they would find a position somewhere in the West. Many did, but thanks to the Y2K bug, these IT professionals suddenly did not need to leave India to work for those corporations.

Outsourcing from America to India, as a new way to collaborate, exploded after the year 2000. Soon after, the dotcom bust shocked many companies that had emerged around the time of Y2K. The bust, a problem for many North American start-ups, was another opportunity for Indian companies: since they were already linked to the West, the bust made the cost of using those links virtually free.

Lessons…learned?

Shortly after it was clear that the Y2K bug problem had been solved, Kappelman (2000) made an evaluation of what he called “Some Strategic Y2K Blessings.” He starts by listing a series of issues that became evident to the profession while working on the problem: “Y2K showed everyone the importance of systems and software in enterprise success…the value of inventorying and tracking IT assets, maintaining standards for software processes including careful version documentation, quality testing practices and independent validation, simplicity in software and systems, clearly defining project management, and a reasonable balance between centralization and decentralization” (p. 42). Somehow, along the way, other issues became more prominent, and it seems the profession forgot about the standards that had been imposed on its practice, i.e., using a 6-digit date standard and using COBOL as the de facto language for business applications. In hindsight, adopting those standards was more a problem than a solution.

Kappelman (2000) indicates that these projects “were extensive and complex” and that, due to “lamentable software practices,” correcting them was “very difficult” (pp. 42-43). Those projects consumed more than half of the IS operating budget. He indicates that the total global cost of Y2K was between $375 billion and $750 billion (p. 43). What was remarkable was to find out that “COBOL accounts for approximately 34% of all applications although only approximately 16% of all professional programmers work with it” (p. 34). It is really unbelievable that, with such a disadvantage, the world was able to avoid major consequences caused by the 6-digit date standard. We can think of the Y2K issue as a major technological “spring-cleaning affair” just before the new millennium. It was an enormous cleaning task! (See Table 1)

Table 1: Magnitude of the Y2K Mess

Outsourcing became the new frontier for American companies. Even though it has declined, it is now one among many solutions companies use to become more competitive. Looking for ways to cut costs by taking jobs away from middle-class Americans became a liability. Outsourcing was a solution to a lack-of-talent problem at the time; regardless, some companies abused it by following the mantra, “I better start outsourcing as many functions as I can…so what can be outsourced must be outsourced” (Friedman 2005, 135), even when talent was available in America.

Outsourcing became an ‘India versus Indiana’ situation. Friedman (2005) describes a real-world example. In 2003, the state of Indiana decided to move its unemployment claims to a computer system because such claims were increasing as firms outsourced work. To do so, it put a contract out to bid. The contract was won by Tata America International, the US-based subsidiary of India’s Tata Consultancy Services Ltd., located in New York City (Friedman 2005, p. 240). Its bid was $8.1 million lower than the closest one, a joint effort of Deloitte Consulting and Accenture Ltd. No firms from Indiana bid for the contract. An Indian company won, and the work was outsourced to its Indian headquarters. The irony is that Indiana was outsourcing the very department responsible for dealing with the claims of Indiana workers affected by outsourcing!

Sadly, the story does not end there. When the details of the contract became public, Republicans made it a campaign issue. It became a political nightmare for the Indiana Governor, a Democrat. The contract was cancelled in the end, and Tata received almost one million dollars to cover some expenses incurred. A new set of smaller contracts was put out to bid so Indiana firms could compete. In the end, the work became more expensive (and less efficient), but it kept the politics at bay.

Alan Blinder (2005) decided to make what he called a bold prediction: “In the future, and to a great extent already in the present, the key distinction for international trade will no longer be between things that can be put in a box and things that cannot. It will, instead, be between services that can be delivered electronically over long distances with little or no degradation of quality, and those that cannot. The tradability of a vast array of services is, as they say, the New New-Thing. And there is little doubt that the fraction of services that can be delivered electronically will grow” (p. 6). That perhaps is the lesson that was hardest to learn, but organizations learnt it very well.

Conclusion

Twenty years after the Y2K problem was solved, information systems have, more than ever, a pervasive presence in every organization. Those systems are not only adopted but are sometimes transforming those organizations, to the point that the new trend in business is digital transformation. In these twenty years, what have we learned? Are we confident that we will not face another doomsday because of the programs running our firms?

It is difficult to assess whether that is the case. It is clear that the success stories of the first two decades of the 21st century are computer-based business models; Google is the new library, Amazon is synonymous with online shopping, Twitter is the new Fourth Estate, and Facebook is the new Commons. Let’s not forget that these companies rely on the algorithms that make them what they are. Those algorithms were coded in computer languages, some of which draw, directly or indirectly, on COBOL sources.

The echo of Santayana’s (1905) maxim, “those who cannot remember the past are condemned to repeat it,” is constantly present. It is important to learn the lessons of the Y2K bug: what programmers were thinking when creating their applications in the late 1950s and early 1960s; how COBOL became a standard in business computing; how the Internet came to the rescue; how outsourcing, as a business model, emerged. All of these are important reminders of what it means to depend on technology mediating human relations.

What can we learn from the Y2K crisis? We can trace the contributions of those who, becoming aware of a looming crisis, rang the warning bell to save us all; those who proposed a solution; those who implemented it; those who quantified it; and those who took advantage of the new business models emerging from that crisis. This story of bugs, languages, and business models is a way to keep the history of a crisis averted, a crisis some call “Y2K: the bug that failed to bite” (Story & Crawford 2001), ignoring the large amount of evidence that the bug did not bite because the systems were debugged, and that it did bite in the few instances where it was not fixed, such as the welcome panel at Nantes’ Central School on January 3, 2000.

By Bug de l'an 2000 - Own work, CC BY-SA 3.0, https://commons.wikimedia.org/w/index.php?curid=14719963

If we ignore the evidence and forget the history of this event that united business and computer professionals, then we will most likely face a similar situation. Another by-product of this lesson is seeing how some solutions to pressing problems become problems themselves, e.g., outsourcing. Students particularly need to learn this history, be inspired by it, and be motivated to continue the work of building safer and better systems. Managers, too, can learn to understand the systems they adopt and to be proactive about keeping in house the skills needed to guarantee that such systems run impeccably.


Bibliography

Arranga, Edmund C. and Wilson Price. (March-April 2000). Fresh from Y2K, What’s next for Cobol? IEEE Software, Focus – Guest Editors’ introduction, 16-20.

Blinder, Alan S. (December 2005). Fear of Offshoring, Princeton University Center for Economic Policy Studies, Working Paper, 119.

Bolles, Spencer. (January 18, 1985). Computer Bugs in the Year 2000. Newsgroup: net.bugs. Usenet: 820@reed.UUCP.

Callaway, Erin. (March 24, 1997). COBOL comeback, PC Week, 131+, https://link.gale.com/apps/doc/A19246435/CPI?u=ocul_carleton&sid=CPI&xid=c2d04124

FIPS. (1968). Federal Information Processing Standards Publication 4. US Department of Commerce, National Bureau of Standards.

Friedman, Thomas. (2005). The World is Flat: A brief history of the Twenty-First Century, Farrar, Straus and Giroux.

Heidegger, Martin. (1993). The question concerning Technology, in D. F. Krell (Ed.) Basic Writings: Ten Key Essays, plus the Introduction to Being and Time, Revised & Expanded Edition, Harper Collins:307-341.

Kappelman, Leon A. (March-April, 2000). Some Strategic Y2K Blessings, IEEE Software, 42-46.

Lohr, Steve. (2001). Go To: The story of Math majors, Bridge players, Engineers, Chess wizards, Maverick Scientists and Iconoclasts – The Programmers who Created the Software Revolution, Basic Books.

Lyons, M.J. (1981). Salvaging your software assets (tools-based maintenance). AFIPS Conference Proceedings, 50, National Computer Conference, AFIPS Press.

McCallum, J. C. (2019). Memory Prices 1957-2019, available at https://jcmit.net/memoryprice.htm

Murray, Jerome and Marylin Murray. (1984). Computers in Crisis: How to Avert the Coming Worldwide Computer Systems Collapse, PBI.

Rose, Ted. (December 22, 1999). “Who Invented Y2K and why did it become so Universally popular?” The Baltimore Sun.

Santayana, George. (1905). The Life of Reason: or, The phases of human progress, Volume 1, Scribner.

Story, Jonathan and Robert J. Crawford. (2001). Y2K: The Bug that Failed to Bite, Business and Politics, 3 (3), 269-296. DOI: 10.1080/13695250120104515

 

Alejandro Ramirez (June 2020). “Of Bugs, Languages and Business Models: A History.” Interfaces: Essays and Reviews in Computing and Culture Vol. 1, Charles Babbage Institute, University of Minnesota, 9-16.


About the author: Alejandro Ramirez is an Associate Professor at the Sprott School of Business – Carleton University in Ottawa, Ontario, Canada. He has a PhD in Management – Information Systems (Concordia), an MSc in Operations Research & Industrial Engineering (Syracuse), and a BSc in Physics (ITESM). He has been active with the Business History Division of ASAC since 2012 and has served as Division Chair and Division Editor. He is interested in the history and the stories of information systems in organizations. He is currently working on a New Frontiers in Research funded project, “Imagining Canada’s Digital Twin,” with colleagues. (Alex.Ramirez@Carleton.ca).


 

Where Dinosaurs Roam and Programmers Play: Reflections on Infrastructure, Maintenance, and Inequality

Jeffrey R. Yost, Charles Babbage Institute, University of Minnesota

Abstract: This short essay examines two temporally separated crises (current unemployment system failures and Y2K), focusing on connections between infrastructural (largely COBOL-based) IT systems, maintenance, and societal inequality. (KW: computer history, unemployment system infrastructure, maintenance, COBOL, Y2K, inequality).

PDF version available for download.

Grace Hopper

Rear Admiral Grace Murray Hopper was an unparalleled leader in the early software field. In addition to her pioneering work with the A-0 compiler, her FLOW-MATIC was particularly influential. More than any other language, FLOW-MATIC provided a model for the COBOL development team. (Image: United States Navy) 

In March 1959, Burroughs Corporation computer scientist Mary Hawes called for an industry and government consortium to develop a standard programming language for business—promoting greater portability for organizational users transitioning between mainframe computers. With the appearance of Autocode, FLOW-MATIC, FORTRAN, ALGOL-58, and other 1950s programming languages, she recognized the high costs of proliferation.

Jean Sammet

Jean Sammet, COBOL co-developer and a visionary leader, was the first woman president of the ACM. Sammet had a long and distinguished career at IBM. Despite prolific achievements and stellar intellectual and managerial contributions to her company and her field, she was not named an IBM Fellow, an honor granted disproportionately to men—approximately 90 percent of the 275 IBM Fellows are male. (Image: Charles Babbage Institute Archives)

The following month, Hawes’ call evolved into the founding of the Conference on Data Systems Languages (CODASYL), sponsored by the U.S. Department of Defense (DoD). CODASYL’s ongoing efforts drew inspiration from Sperry-Univac’s Grace Murray Hopper, her FLOW-MATIC, and her advocacy for languages approximating English syntax. For these important contributions, Hopper is sometimes referred to as the “Mother of COBOL” (Common Business-Oriented Language). Despite some crediting her as its “inventor,” a CODASYL committee of six—Howard Bromberg (RCA), Howard Discount (RCA), Vernon Reeves (Sylvania), Jean Sammet (Sylvania, joined IBM in 1961), William Selden (IBM), and Gertrude Tierney (IBM)—developed COBOL. The DoD published the COBOL 60 specifications in January 1960; eight iterations have followed, the most recent in 2014. It began as a standard for the DoD and became a standard of the American National Standards Institute (ANSI) and the International Organization for Standardization (ISO). Today, though mainframes and COBOL remain widespread global technologies, their common descriptors are dinosaurs and dinosaur code.
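To give a sense of what “approximating English syntax” meant in practice, the fragment below is a minimal, hypothetical sketch (not drawn from any historical program): COBOL statements read much like imperative English sentences, built from verbs such as SUBTRACT and DISPLAY and conditions spelled out in words.

       IDENTIFICATION DIVISION.
       PROGRAM-ID. PAYROLL-SKETCH.
       DATA DIVISION.
       WORKING-STORAGE SECTION.
       01  GROSS-PAY    PIC 9(5) VALUE 1000.
       01  TAX          PIC 9(5) VALUE 150.
       01  NET-PAY      PIC 9(5).
       PROCEDURE DIVISION.
      *    Arithmetic and conditions are written out in English-like verbs.
           SUBTRACT TAX FROM GROSS-PAY GIVING NET-PAY.
           IF NET-PAY IS GREATER THAN 500
               DISPLAY "NET PAY ABOVE THRESHOLD"
           END-IF.
           DISPLAY "NET PAY: " NET-PAY.
           STOP RUN.

This readability, inherited in spirit from FLOW-MATIC, was meant to make business programs legible to managers and auditors as well as to programmers.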

Still Coding After All These Years

To highlight COBOL’s staying power, and perhaps glimpse its future, in 2014 the Defense Contract Management Agency (DCMA) stated that it was not looking to replace its system of two million lines of COBOL code (handling 330,000 contracts worth $1.2 trillion) but was re-upping on COBOL. DCMA put out a statement “bragging” that its new COBOL system would “probably be around for another 20 to 30 years” (Mazmanian, 2014).

Back in 2004, IT research firm Gartner, Inc. had estimated there were two million programmers knowledgeable in COBOL—eight percent of all programmers globally—but that the number was decreasing at five percent a year (King, 2020). 

Today, almost half of the banks in the U.S. run systems programmed in COBOL, and 95 percent of all ATM transactions rely on COBOL (Allyn, 2020)—trillions of dollars in transactions every day. Even in normal times, demand for COBOL experts exceeds supply.

Early Quincy ATM by Burroughs Corporation

Early Quincy ATM by Burroughs Corporation. COBOL code was and remains the backbone of ATM transaction processing. (Image: Charles Babbage Institute Archives).

Feeding the Beast

From the 1960s into the 1990s, many universities offered COBOL courses, as did companies and vocational schools like the Control Data Institutes. Today, in an age when AI/analytics, games, robotics, cloud, and the internet of things are foremost for many computer science students, few consider learning legacy systems and legacy languages. Accordingly, COBOL courses are scarce. A Slate article quoted Prof. John Zeanchock of Robert Morris University stating that just 37 colleges and universities globally have a “mainframe course” on the curriculum. Most schools’ faculty are unable to suggest legacy-specialist students or graduates when banks or local governments call (Botella, 2020). In our culture, innovation is revered and maintenance is not. In IT there is a myopic attention to the latest tech and a failure to recognize and value that IT maintenance requires great skill and can be innovative (new processes, new fixes, etc.). Privileging innovation over maintenance is also in part tied to gender stereotypes and discrimination, as historically women have had greater opportunity in the critical areas of services, maintenance (both machines and debugging), and programming (from plugboards to languages), and fewer opportunities in computer and software engineering (Yost, 2011, 2017).

 

Women students at PLATO terminals
Women students at PLATO terminals at the University of Illinois in 1963. Women’s participation in CS as majors increased as the field gained traction in the 1960s and 1970s, and the percentage of women peaked in the 1980s; since then it has declined sharply to a low plateau. This lack of gender diversity holds back CS and IT labor in all specializations, including legacy. (Image: Charles Babbage Institute Archives).

The percentage of women majors in computer science has declined sharply over the past quarter century—from more than 35 percent in the 1980s to 18.1 percent in 2014, varying only slightly since (nsf.gov/statistics). The reasons are varied, but gender stereotyping, a male-dominated computing culture, and educational and workplace discrimination are factors (Abbate, 2012; Hicks, 2017; Misa, 2011). This has furthered labor shortages (in all areas, including legacy) and held back computer science. Labor shortages can become all the more profound in times of crisis, including the current health and economic crisis.

More than a Jersey Thing

On April 6, 2020, New Jersey Governor Phil Murphy made a public plea for volunteer “Cobalt” programmers (meaning COBOL) to aid New Jersey and help fix glitches in an overburdened unemployment-benefits computer system more than 40 years old. New Jersey was having difficulty processing unemployment payments in a timely way for the flood of new filers. The increased burden (in volume and parameters) made the unemployment system a major bottleneck, or to borrow Thomas Hughes’ term, a reverse salient, to timely and accurate data processing for those in need (Hughes, 1983).

This sparked an onslaught of news articles as well as many Twitter, Facebook, and other social media posts. The critiques ranged from Governor Murphy and New Jersey relying on an antiquated unemployment insurance computer system to the call for volunteers from the population segment most susceptible to COVID-19 risk—the elderly. Meanwhile, social media erupted with ageist jokes and images of elderly individuals as potential volunteers.

Control Data Institutes (CDI)
Both university courses and those at IT vocational schools like Control Data Institutes (CDI) were critical to teaching a generation of COBOL programmers in the 1960s and 1970s—many now retired. Here we see a CDI classroom in 1967. (Image: Charles Babbage Institute Archives)

Other states, including Connecticut and Kansas, had similar shortages of trained COBOL experts to confront unemployment insurance system challenges. Understandably, unemployed workers waiting for benefits are extremely frustrated and angry, expressing as much on the Kansas Department of Labor (KDoL) platform. Much is the matter with Kansas’ system, which has its origins in the 1970s and has received inadequate updates for flexibility and scale. In late April, KDoL indicated a timeline in which processing could occur by late May (for many, that will push the wait to months). For states that have prioritized investing in updating other computer systems, but not unemployment insurance, it amounts to neglecting infrastructure that serves the most vulnerable in society.

Why do so many states have ill-equipped IT systems for unemployment benefits processing? Replacing long-standing systems is complex and expensive (hundreds of millions of dollars). Change is also disruptive to existing labor and existing skill sets. Unemployment systems serve those lacking political power; federal and state governments deprioritize them. Further, systems (in all their technical, political, economic, and other contexts) become entrenched, or to use Hughes’ concept, gain momentum (Hughes, 1983). Failures and pressures can redirect momentum: some states scrambled for cloud solutions once systems crashed in April—possibly the least bad option, but at suboptimal timing, since standing up new systems and processes on the fly is especially difficult. Regardless, the problem is one of infrastructure—of not valuing maintenance, labor, and recipients. It is not merely COBOL versus the cloud; in fact, COBOL can and does integrate with AWS, Azure, and IBM clouds, and hybrid cloud is common.

State IT Workers and Hired Guns’ Heroic Efforts

North Texas’ COBOL Cowboys staffing firm, larger IT services enterprises, and COBOL-skilled independent contractors are in great demand. The governors, state DoLs, and state CIOs are doing their best to staff up to address problems. For the systems analysts, programmers, and other state employees and contractors, the hours are long, the work difficult, and the efforts truly heroic. The Federal CARES Act’s unemployment benefits, PUA/PEUC, allow states to extend the duration of benefits and to include those usually not eligible—the self-employed. This adds greatly to both volume and complexity. In my playful title, “play” refers both to where the work plays out/is performed (fewer coders choosing legacy) and to coders’ creativity—in the spirit of CS metaphors like the “sandbox” for building (non-live) code.

Global digital divide map

 

As this United Nations graphic shows, the global digital divide is profound. The ramifications during a global pandemic, when digital connectivity influences opportunities to shelter, connect, and safely earn income, are extreme. This map is not intended to and does not capture the deep digital divide in the U.S. along class and race lines. (Image: Dakman5, granting public domain rights, Wikicommons)

Domestic and Global Digital Divides

In the coming year, the overall percentage of Americans below the poverty line will peak higher than at any time in more than 50 years—the impact on African-American, Hispanic, and Native-American populations is particularly severe. Disparities in access to health insurance, banking, loans, and information technology, as well as in exposure to risk and in COVID-19 incidence and mortality, highlight extreme and growing race and class inequality in the United States.

Washington D.C.’s unemployment platform urges benefits filers to use Microsoft’s Internet Explorer, a browser Microsoft began retiring in January 2016; an unsupported version remains for computers, not smartphones. A 2019 Pew Research Center survey showed that 54 percent of Americans with incomes under $30,000 a year have a computer, while 71 percent have a smartphone. Of those making over $100,000, 94 percent have a computer and broadband at home (Anderson and Kumar, 2019). Only 58 percent of African-Americans have a computer, versus 82 percent of whites (Perrin and Turner, 2019). In digital infrastructure, just as in education, healthcare, and housing, there are two Americas.

 

Control Data 3600 mainframe
Unloading a Control Data 3600 mainframe in 1964 at Tata Institute, Bombay, India. Both the IITs and Tata were fundamental IT educational and vocational infrastructure that allowed a major software and services industry to prosper in the 1990s with COBOL Y2K compliance work and much more. (Image: Charles Babbage Institute Archives)

Y2K: Why to Care

An earlier crisis largely involving COBOL, one with a long and visible runway, is both consequential context for and instructive about current challenges. About a quarter century ago, governments and corporations began seriously addressing the pending Y2K crisis—caused by the common practice of storing years as two digits, often in COBOL code—to avert risks to life and the economy and to make it a nonevent.
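As a purely hypothetical sketch (not drawn from any actual production system), the fragment below shows the kind of two-digit date arithmetic that made Y2K a problem: when the current-year field rolls from “99” to “00,” a simple age or interval calculation goes negative.

       IDENTIFICATION DIVISION.
       PROGRAM-ID. Y2K-SKETCH.
       DATA DIVISION.
       WORKING-STORAGE SECTION.
      *    Two-digit year fields were a common storage-saving practice.
       01  BIRTH-YY     PIC 99 VALUE 60.
       01  CURRENT-YY   PIC 99 VALUE 00.
       01  AGE          PIC S999 SIGN LEADING SEPARATE.
      *    In 1999 this computed 99 - 60 = 39; with "00" it computes -60.
       PROCEDURE DIVISION.
           SUBTRACT BIRTH-YY FROM CURRENT-YY GIVING AGE.
           DISPLAY "COMPUTED AGE: " AGE.
           STOP RUN.

Remediation generally meant either expanding such fields to four digits or applying “windowing” logic that interprets, for example, years 00-49 as 2000-2049 and 50-99 as 1950-1999.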

Investments and global cooperation were key, and the International Y2K Cooperation Center played a meaningful role in fostering collaboration. The shortage of programmers knowledgeable in COBOL, along with the lower expense and the overwhelming volume of code, led to outsourcing to an emerging Indian IT services industry. This lent momentum to that trade and to a shifting geography of IT work that remains impactful (though corporate decision-makers are now accelerating artificial intelligence applications, producing further labor transformations detrimental to Indian IT workers, developments that standout ABD sociologist and CBI IDF Fellow Devika Narayan is insightfully analyzing). Gartner, Inc. estimated that U.S. government and business expenditures ran up to $225 billion, a breathtaking sum indicative of the costs of putting off maintenance until a time-sensitive crisis. Because the new millennium arrived with few major problems, the outcome lent credence to two diverging interpretations: that heavy investment in maintenance had been necessary to avert catastrophe, or, more commonly (and less accurately), that it was an overhyped problem leading to squandered funds on preparation and maintenance fixes. Offshoring saved money in the short run but may not have in the longer run; it left a legacy of less and less current, on-shore COBOL expertise (for maintenance, updates, security, etc.), a workforce and talent helpful in global crises, particularly ones in which unfortunate (U.S.) nationalistic tendencies and policies have inhibited international cooperation.

CONNECT and Disconnects

Maintaining infrastructure is important. Anemic IT budgets have hurt not only opportunities to change and move to innovative new solutions but also the ability to maintain existing systems and to better assure that they perform, and perform to scale, in both normal times and crises. The reverse salient certainly is not always COBOL or COBOL alone. State auditors warned Florida Governor Ron DeSantis that Florida’s unemployment site, its “CONNECT” cyberinfrastructure, had more than 600 system errors in need of fixing, but that state officials had “no process to evaluate and fix” (Mower, 2020). It was a 2013, $77 million system which, he is quick to point out, his administration inherited. This underlines the challenge not just in Florida but in many states—inadequate infrastructure is the predecessors’ fault, is not the current leaders’ problem, and fixes should lie with successors. Often the (now) multi-hundred-million-dollar cost typical of major upgrades to new unemployment insurance systems (and their ongoing refinement) is difficult to bear without federal assistance. Florida’s CONNECT is a reminder of damaging disconnects and of leaders’ inattention to infrastructure for vulnerable people. The problem is also one of meager and dwindling federal support. Federal aid for state unemployment administration has been dropping for a quarter century, with severe cuts in 2018 and 2019. In a pre-COVID-19 survey, more than half of the states responded that their unemployment system problems were “serious” or “critical” (Botella, 2020).

Minneapolis Interstate 35W Bridge collapse
Minneapolis Interstate 35W Bridge collapse, August 2007. Physical infrastructure receives far more federal support for states than ethereal software infrastructure does. Both require evaluation, audits/checks, and timely maintenance—or they break, which for software takes the form of crashes, delays, breaches, etc. (Image: Kevin Rofidal, United States Coast Guard. Wikicommons. Public domain USCG image. 17 U.S.C. § 101 and § 105).

Neglected Infrastructure and Crashes

Working two-tenths of a mile from the site of the 2007 Interstate 35 West Mississippi River Bridge collapse in Minneapolis is a frequent reminder that strong, safe, and well-maintained infrastructure is essential. Twenty-eight percent of infrastructure project funding at the state level comes from federal grants (primarily for physical infrastructure). States’ invisible software infrastructure is starved, especially unemployment systems. Hopefully the COVID-19 pandemic leads not only to evaluating our medical preparedness (ICUs, PPE, and unmet needs in free-enterprise insurance and healthcare) but also to greater evaluation of IT infrastructures. Ideally, these developments will lead all governors with poor unemployment insurance system performance to the same conclusion as Governor Murphy about the need for post-mortems on digital infrastructure. As he put it, “how the heck did we get here when we literally needed COBOL programmers”—learning from the past is important.

History Matters

One thing clear from the two COBOL crises is that history and archives matter—my thoughts here have at best scratched the surface of fundamental IT infrastructure and contexts that someone could analyze with tremendous depth using Charles Babbage Institute resources. CBI’s archival and oral history resources (most transcripts online, all free) for studying the Y2K crisis and the history of CODASYL and COBOL (and many other topics and themes in the history and social study of computing) are the finest and most extensive in the world. A talented University of Pennsylvania doctoral candidate in the History and Sociology of Science, Zachary Loeb, has drawn on CBI’s International Y2K Cooperation Center Records for his important dissertation on the cultural, political, and technical history of Y2K.

Over the years, a number of researchers have used our Conference on Data Systems Languages (CODASYL) Records. While that collection stands out for documenting COBOL and the group’s work with databases (in 1959 and far beyond), we have many other COBOL materials in a variety of collections. One such collection (a recent one) is our largest overall collection at more than 500 linear feet, the Jean Sammet Papers—Sammet may have been the single most important developer of COBOL. Likewise, our Frances E. (“Betty”) Holberton Papers contain rich material on CODASYL and COBOL. There is also great COBOL content in our Burroughs Corporation Records, Control Data Corporation Records, Gartner Group Records, Auerbach and Associates Market and Product Reports, IBM SHARE, Inc., HOPL 1978, Charles Phillips Papers, Jerome Garfunkel Papers, Warren G. Simmons Papers, National Bureau of Standards Computer Literature, Computer Manuals, and many other collections. COBOL’s history is one of government, industry, and intermediary partnerships, standards, maintenance, labor, gender, politics, culture, and much more. In a technical area that always seems focused on the new, new thing, its 60-year past and its continuing presence deserve greater study.


Bibliography

Abbate, Janet. (2012). Recoding Gender: Women’s Changing Participation in Computing, MIT.

Allyn, Bobby. (2020). “COBOL Cowboys Aim to Rescue the Sluggish State Unemployment Systems." NPR, April 22, 2020.

Anderson, Monica and Madhumitha Kumar. (2019). “Digital Divide Persists…” Pew Research Center, May 7, 2019.

Botella, Elena. (2020). “Why New Jersey’s Unemployment System Uses a 60-Year-Old Programming Language.” Slate, April 9, 2020.

Charles Babbage Institute Archives (finding aids to the collections mentioned in final paragraph).

Hicks, Marie. (2017). Programmed Inequality: How Britain Discarded Women Technologists and Lost Its Edge in Computing, MIT Press.

Hughes, Thomas P. (1983). Networks of Power: Electrification in Western Society, 1880-1930, Johns Hopkins University Press.

Kennelly, Denis. (2019) “Three Reasons Companies are only 20% Into Cloud Transformation.” IBM.com, March 5, 2019.

King, Ian. (2020). “An Ancient Computer System is Slowing Giant Stimulus.” Bloomberg.com, April 13, 2020.

Mazmanian, Adam. (2014). “DoD Plans Upgrade to COBOL-based Contract System,” FCW, July 7, 2014.

Misa, Thomas J., ed. (2011). Gender Codes: Why Women are Leaving Computing, Wiley.

Mower, Lawrence. (2020). “Ron DeSantis…” Tampa Bay Times, March 31, 2020.

Perrin, Andrew and Erika Turner. (2019) “Smartphones Help Blacks and Hispanics Bridge Some—But Not All—Digital Gaps with Whites,” Pew Research Center, August 20, 2019.

Yost, Jeffrey R. (2011). “Programming Enterprise: Women Entrepreneurs in Software and Computer Services,” in Misa, ed. [full cite above].

Yost, Jeffrey R. (2017). Making IT Work: A History of the Computer Services Industry, MIT Press.

Special thanks to CBI Acting Archivist Amanda Wick for discussion/insights on COBOL and our collections.

 

Jeffrey R. Yost (May 2020). “Where Dinosaurs Roam and Programmers Play: Reflections on Infrastructure, Maintenance, and Inequality.” Interfaces: Essays and Reviews in Computing and Culture Vol. 1, Charles Babbage Institute, University of Minnesota, 1-8.


About the author:  Jeffrey R. Yost is CBI Director and HSTM Research Professor at the University of Minnesota. He has published six books (and dozens of articles), most recently Making IT Work: A History of the Computer Services Industry (MIT Press, 2017) and FastLane: Managing Science in the Internet World (Johns Hopkins U. Press, 2016) [co-authored with Thomas J. Misa]. He is a past EIC of IEEE Annals of the History of Computing, and current Series Co-Editor [with Gerard Alberts] of Springer’s History of Computing Book Series.  He has been a principal investigator of a half dozen federally sponsored projects (NSF and DOE) on computing/software history totaling more than $2 million. He is Co-Editor [with Amanda Wick] of Interfaces: Essays and Reviews in Computing & Culture.