Interfaces publishes short essay articles and essay reviews connecting the history of computing/IT studies with contemporary social, cultural, political, economic, or environmental issues. It seeks to be an interface between disciplines, and between academics and broader audiences.
2021, Volume 2
The Cloud, the Civil War, and the “War on Coal”
Paul E. Ceruzzi, National Air and Space Museum, Smithsonian Institution
Abstract: The term "the Cloud" has entered the lexicon of computer-speak along with "cyberspace," "the Matrix," "the ether," and other terms suggesting the immateriality of networked computing. Cloud servers, which store vast amounts of data and software accessible via the Internet, are located around the globe. This essay argues that this "matrix" has an epicenter: the former rural village of Ashburn, Virginia. Ashburn's significance is the result of several factors, including northern Virginia's historic role in the creation of the Internet and its predecessor, the ARPANET. The Cloud servers located there also exist because of the availability of electric power, including a grid of power lines connected to wind turbines and gas- and coal-fired plants to its west—a "networking" of a different type but just as important.
In his recent book, Making IT Work, Jeffrey Yost quotes a line from Joni Mitchell's famous song "Both Sides, Now" (from her album Clouds): "I really don't know clouds at all." He also quotes the Rolling Stones' hit "Get Off of My Cloud." Why should a business or government agency trust its valuable data to a third party whose cloud servers are little understood? No thank you, said the Rolling Stones; not until you can explain to me just what the Cloud is and where it is. Yost gives an excellent account of how cloud servers have come to the fore in current computing. Yet Joni Mitchell's words still ring true. Do we really know what constitutes the "Cloud"?
A common definition of the Cloud is that of sets of high-capacity servers, scattered across the globe, using high-speed fiber to connect the data stored therein to computing installations. These servers supply data and programs to a range of users, from mission-critical business customers to teenagers sharing photos on their smartphones. What about that definition is cloud-like? Our imperfect understanding of the term is related to the misunderstanding of similar terms also in common use. One is "cyberspace," whose popularity is attributed to the science fiction author William Gibson and his novel Neuromancer, published in 1984. Another is "the Matrix": the title of a path-breaking book on networking by John Quarterman, published in 1990 at the dawn of the networked age. The term came into common use after the award-winning 1999 Warner Brothers film starring Keanu Reeves. (Quarterman was flattered that Hollywood used the term, but he is not sure whether the producers of the film knew of his book.) In the early 1970s, Robert Metcalfe, David Boggs, and colleagues at the Xerox Palo Alto Research Center developed a local area networking system they called "Ethernet," suggesting the "luminiferous aether" once believed to carry light through the cosmos.
These terms suggest an entity divorced from physical objects—pure software independent of underlying hardware. They imply that one may take the hardware component as a given, just as we assume that fresh, drinkable water comes out of the tap when we are thirsty. The residents of Flint, Michigan, know that a robust water and sewerage infrastructure is hardly a given, and Nathan Ensmenger has reminded us that the "Cloud" requires a large investment in hardware, including banks of disk drives, air conditioning, fiber connections to the Internet, and above all, a supply of electricity. Yet the perception persists that the Cloud, like cyberspace, is out there in the "ether."
Most readers of this journal know the physical infrastructure that sustains Ethernet, cyberspace, and the Cloud. I will go a step further: not only does the Cloud have a physical presence, but it also has a specific location on the globe: Ashburn, Virginia.
A map of northern Virginia prepared by the Union Army in 1862 shows the village of Farmwell, and nearby Farmwell Station on the Alexandria, Loudoun, and Hampshire Railroad. The town later changed its name to Ashburn, and it lies just to the north of Washington Dulles International Airport. In the early 2000s, as I was preparing my study of high technology in northern Virginia, Ashburn was still a farming community. By the year 2000 Farmwell Station was a modest center of Ashburn: a collection of buildings centered on a general store. The railroad had been abandoned in 1968 and was now the Washington and Old Dominion rail-trail, one of the most popular and heavily traveled rails-to-trails conversions in the country. Thirsty hikers and cyclists could get refreshment at the general store, which had also served neighboring farmers with equipment and supplies.
Cycling along the trail west of Route 28 in 2020, one saw a series of enormous low buildings, each larger than a football field, surrounded by a mad frenzy of construction, with heavy equipment trucks chewing up the local roads. Overhead was a tangle of high-tension electrical transmission towers, with large substations along the way distributing the power. The frenzy of construction suggested what it must have been like in Virginia City, Nevada, after the discovery of the Comstock Lode silver. The buildings themselves had few or no markings on them, but a Google search revealed that one of the main tenants was Equinix, a company that specializes in networking. The server tenants try to avoid publicity, but the local chambers of commerce, politicians, and real estate developers are proud to showcase the economic dynamo of the region. A piece on the local radio station WTOP on November 17, 2020, announced that "Equinix further expands its big Ashburn data center campus," quoting a company spokesperson saying that "…its Ashburn campus is the densest interconnection hub in the United States." An earlier WTOP broadcast, reporting on the activities of a local real estate developer, noted that "Northern Virginia remains the 'King of the Cloud.'" In addition to Equinix, the report mentioned several other tenants, including Verizon and Amazon Web Services.
These news accounts are more than just hyperbole from local boosters. Other evidence indicates that, although cloud servers are scattered across the globe, Ashburn is indeed the navel of the Internet.
In my 2008 study of Tysons Corner, Virginia, I mentioned several factors that led to the rise of what I then called "Internet Alley." One was the development of the ARPANET at the Pentagon, and later at a DARPA office on Wilson Blvd. in Rosslyn. Another was the rise of the proto-Internet company AOL, headquartered in Tysons Corner. Tysons Corner was also the location of "MAE-East," a network hub that carried a majority of Internet traffic in the network's early days. The root servers of the dot-com and dot-org registry were once located in the region, with the "A" root server in Herndon, later moved to Loudoun County. The region thus had a skilled workforce of network-savvy electrical and computer engineers, plus local firms such as SAIC and Booz Allen that supported networking as it evolved from its early incarnations.
Around the year 2000, while many were relieved that the "Y2K" bug had little effect on mainframe computers, the dot-com frenzy collapsed. The AOL-Time Warner merger was a mistake. But there was an upside to the boom-and-bust. In the late 19th and early 20th centuries the nation experienced a similar boom and bust of railroad construction. Railroads went bankrupt and people lost fortunes, but the activity left behind a robust, if overbuilt, network of railroads that served the nation well during the mid and late 20th century. During the dot-com frenzy, small firms like Metropolitan Fiber dug up many of the roads and streets of Fairfax and Loudoun Counties and laid fiber-optic cables, which offered high-speed Internet connections. After the bust these went unused—"dark fiber," as it was called. Here was the basis for establishing Cloud servers in Ashburn. By 2010, little land was available in Tysons Corner, Herndon, or Reston, but a little farther out along the W&OD rail-trail there was plenty of available land.
That leaves the other critical factor in establishing Cloud servers—the availability of electric power. While some Cloud servers are located near sources of wind, solar, or hydroelectric power, such as in the Pacific Northwest, Northern Virginia has few of those resources. The nearest large-scale hydroelectric plant, at the Conowingo Dam, lies about 70 miles to the north, but its power primarily serves the Philadelphia region. (That plant was the focus of the classic work on electric power grids, Networks of Power, by Thomas Parke Hughes.) To answer the question of the sources of power for Ashburn, we return to the Civil War map and its depiction of the Alexandria, Loudoun, and Hampshire, later known as the Washington and Old Dominion Railroad.
The origins of that line go back to the 1840s, when freight, especially coal, from the western counties of Virginia was being diverted to Baltimore, Maryland, over the Baltimore and Ohio Railroad. In response, Virginians chartered a route west over the Blue Ridge to the mineral- and timber-rich areas of Hampshire County. (In 1866 Mineral County, in the new state of West Virginia, was carved out of Hampshire County.) The Civil War interrupted construction, and after several challenges to its financial structure, the line was incorporated as the Washington and Old Dominion Railway Company in 1911. It never reached farther than the summit of the Blue Ridge, and the proposed route to the west would have had to cross rugged topography. The line could never have competed with the B&O's water-level route. The shortened line soldiered on until finally being abandoned in 1968, making way for the rail-trail conversion. One interesting exception was a short spur in Alexandria, which carried coal to a power plant on the shore of the Potomac. That plant was decommissioned in 2014, ending the rail era of the Alexandria, Loudoun, and Hampshire.
Charles Babbage’s Ninth Bridgewater Treatise
Margaret Dykens, MLIS, MS, Curator and Director of the Research Library, San Diego Natural History Museum
With preface by Amanda Wick, Interim Archivist, Charles Babbage Institute Archives
Who was Charles Babbage?
Charles Babbage, Victorian scientist and mathematician, was born on December 26, 1791, to a family of London bankers. Fascinated with mathematics, and especially algebra, he studied the subject at Trinity College, Cambridge. While attending Cambridge, he co-founded the Analytical Society to promote continental mathematics and reform the traditional teaching methodologies of the time. Many of these methods are still used in some form today in the instruction of algebra.
Following completion of his degree, Babbage worked as a mathematician in the insurance industry. He was elected a Fellow of the Royal Society in 1816 and played a prominent part in the foundation of the Astronomical Society (later the Royal Astronomical Society) in 1820. As a member of the Royal Society during the heady days of the early 1800s, Babbage came into contact with a number of great thinkers and engaged in a robust correspondence with fellow mathematicians, naturalists, and philosophers—including Sir John Herschel, Charles Darwin, and Ada Lovelace.
In 1821 Babbage invented the first of his two calculating machines, the Difference Engine, which quickly became his singular passion and focus. The Difference Engine was intended to compile mathematical tables; after completing a working portion of it in 1832, he began work on a more complex and multifunctional machine that could perform any kind of calculation. This was the Analytical Engine (1856), whose invention is widely considered to mark the founding of the field of modern computing.
Today, little remains of Babbage's prototype computing machines; the critical tolerances they required exceeded the level of technology available at the time. Though Babbage's work was formally recognized by respected scientific institutions, the British government suspended funding for his Difference Engine in 1832 and, after an agonizing waiting period, ended the project in 1842. Though Babbage's work was continued by his son, Henry Prevost Babbage, after his death in 1871, the Analytical Engine was never successfully completed, and ran only a few "programs" with embarrassingly obvious errors.
Despite his many achievements in mathematics, scientific philosophy, and his leadership in contemporary social movements, Babbage was left a disappointed and embittered man by his failure to construct his calculating machines. He died at his home in London on October 18, 1871.
What’s in a name?
The calculating engines of English mathematician Charles Babbage (1791-1871) are among the most celebrated icons in the prehistory of computing. Babbage's Difference Engine No. 1 was the first successful automatic calculator and remains one of the finest examples of precision engineering of the time. Babbage is sometimes referred to as the "father of computing." The International Charles Babbage Society (later the Charles Babbage Institute) took his name to honor his intellectual contributions and their relation to modern computers.
Where is Babbage in the Archives?
Materials related to Charles Babbage are scattered around the world, with the vast majority of his personal papers and library held at the Science Museum, London, and the British Library. Although the Charles Babbage Institute is named after Charles Babbage, we actually have very little material originating with our namesake. What we do have are first editions of many of his books and journal articles, a number of them inscribed by the author with dedications to his patrons. These rare materials constitute the earliest materials in our repository and, while used in classroom settings and on exhibit, rarely leave our vault. Our holdings of Babbage's work include the following:
- Babbage, Charles. On a Method of Expressing by Signs the Action of Machinery. London: [Royal Society of London], 1826.
- Babbage, Charles. Reflections on the Decline of Science in England, and on Some of Its Causes. London: Printed for B. Fellowes (Ludgate Street); and J. Booth (Duke Street, Portland Place), 1830.
- Babbage, Charles. On the Economy of Machinery and Manufactures. London: C. Knight, 1832.
- Babbage, Charles. Passages from the Life of a Philosopher. London: Longman, Green, Longman, Roberts, & Green, 1864.
- Babbage, Charles. The Ninth Bridgewater Treatise: A Fragment. 2nd ed., reprint. Cass Library of Science Classics, No. 6. London: Cass, 1967.
What is the Ninth Bridgewater Treatise?
One uniquely significant title in Babbage's oeuvre is the Ninth Bridgewater Treatise. This volume presents Babbage's perspective on the eight Bridgewater Treatises—a series of works by influential thinkers of the era on natural history, philosophy, and theology. Babbage's contribution was not officially affiliated with the eight-volume series and represented his own considerations on the topic. In the volume, which he titled the Ninth Bridgewater Treatise, he discusses his calculating machines and posits the idea of God as a divine programmer who established the rigid natural laws that govern humanity and civilization; in many ways it presents a case for deus ex machina.
As a fragmentary piece, and one that does not dwell on mathematical or scientific subjects, it is a rarity among Babbage materials. Our copy is a second edition and, while in excellent condition, it is not especially rare. Recently, the Curator and Director of the Research Library at the San Diego Natural History Museum, Margaret Dykens, experienced one of those once-in-a-lifetime finds when she reviewed an anomaly within the museum's catalog: an edition of Babbage's Ninth Bridgewater Treatise that seemed to be a galley proof. As she notes in the following article, close examination of the item by both her and the noted Babbage scholar Dr. Doron Swade yielded several incredible finds.
Charles Babbage Institute. "About Charles Babbage." Charles Babbage Institute website. Accessed 10 June 2020. http://www.cbi.umn.edu/about/babbage.html
Swade, Doron. "Babbage, Charles (1791–1871), mathematician and computer pioneer." Oxford Dictionary of National Biography, 23 September 2004. Accessed 12 June 2020. https://www.oxforddnb.com/view/10.1093/ref:odnb/9780198614128.001.0001/odnb-9780198614128-e-962.
Amanda Wick (July 2020). “Charles Babbage’s Ninth Bridgewater Treatise.” Interfaces: Essays and Reviews in Computing and Culture Vol. 1, Charles Babbage Institute, University of Minnesota, 17-22.
About the author: Amanda Wick is the interim archivist at the Charles Babbage Institute Archives (CBIA) at the University of Minnesota. Prior to working at CBIA, Amanda led major processing projects at the University of Minnesota and managed the archives of the Theatre Historical Society. She obtained her Bachelor's degree in Environmental Studies from Lawrence University (Appleton, WI) and her Master's in Library and Information Science from Dominican University (River Forest, IL).
Charles Babbage’s Ninth Bridgewater Treatise in the SDNHM Library
Margaret Dykens, MLIS, MS, Curator and Director of the Research Library, San Diego Natural History Museum
Abstract: As a foundational figure in the history of science, Charles Babbage is best known for his contributions to computing; his mechanical, programmable calculating machines are considered precursors to modern computers. These accomplishments were the primary reason for the naming of the Charles Babbage Institute, and its archivists have sought to honor its namesake through the purchase of rare books authored and inscribed by him. One such book is a fragmentary oddity, the Ninth Bridgewater Treatise; a copy owned by the San Diego Natural History Museum, recently examined by curatorial staff and the prominent Babbage scholar Dr. Doron Swade, holds curious clues to Babbage's approach to natural philosophy. (KW: Babbage, Charles; Swade, Doron; computing history; rare books; antiquities; archives.)
The Research Library of the San Diego Natural History Museum (SDNHM), founded in 1874, has extensive holdings of rare and antiquarian books, including natural history volumes dating back to 1514. The majority of these books were donated by various naturalists and philanthropists over the past one hundred years. One such naturalist was General Anthony Wayne Vogdes (1843-1923), a career Army officer with an active secondary career as a geologist and paleontologist. Vogdes was also an avid bibliophile, and his extensive scientific library came to the SDNHM after his death in 1923. One of the books from Vogdes' library was a first edition of Charles Babbage's Ninth Bridgewater Treatise (1837).
This particular volume was mentioned in a newspaper article published on January 11, 1896, in the San Francisco Bulletin, which described many of the most important books in Vogdes' personal library. Babbage's Ninth Bridgewater Treatise is mentioned in the list with the comment that it contained "annotations by the author." The book in question appears to be a galley proof with wide margins and many handwritten pencil annotations, as well as marginalia likely written by the author.
There is also a portion of a handwritten letter bound into the book itself—Vogdes was an amateur bookbinder and his library consists almost exclusively of his own bindings, many of which have notes, letters, images, or other memorabilia that he collected and bound into the text.
I was intrigued by the handwritten annotations and marginalia in Vogdes' copy of the Ninth Bridgewater Treatise and contacted Dr. Doron Swade, preeminent Babbage scholar and retired curator of the Charles Babbage collection at the Science Museum, London, for verification of the handwriting. After I emailed Dr. Swade several images of the annotations, he replied that it was highly likely they were in Charles Babbage's own hand, both because of the style of writing and because of the content itself. To quote Dr. Swade:
Having gone through the 7,000 manuscript sheets (ms) of the Babbage Scribbling Books, the handwriting in what is visible on the folded manuscripts interleaved on page 128, and in the third image, looks very much like Babbage's, as do the pencilled annotations.
But there is stronger evidence for the annotations and ms being his: in the preface ‘advertisement’ to the second edition Babbage states that the chapter ‘On Hume’s Argument Against Miracles’ has been ‘nearly rewritten’. The first image you sent with the pencilled annotations, which are surely from the first edition, correspond to changes made in the second edition. It is not credible that anyone other than Babbage would have made what are essentially editorial instructions, and editorial amendments, that were carried through to the second edition.
There is even more conclusive evidence in the sample page 131 where the pencilled annotations appear verbatim in the second edition, and the several pencilled deletions have also been carried through.
The ms in the third of the images you sent starts with the same opening sentence that appears in the second edition at the top of page 127 though what follows has been edited and amended. It could be that this is a sheet from the original manuscript for the first edition though not having access to a first edition I am unable to confirm this.
It is fair to conclude that the annotations are Babbage’s. It is difficult to see any other explanation.
Although I do not know how General Vogdes came to have this particular annotated first edition of the Ninth Bridgewater Treatise in his collection, I am not surprised, as his entire library comprised over seven thousand scientific volumes on topics related to geology, paleontology, and other scientific and philosophical disciplines. Indeed, his personal library included works by Darwin, Hume, Dana, Agassiz, and Lyell as well as many other well-known natural historians and intellectuals.
We are hopeful that this unique source might be of interest to some Babbage researcher or historian. Any scholars interested in pursuing this topic further should feel free to contact me directly at the SDNHM Research Library.
Swade, Doron. "Ninth Bridgewater Treatise." Message to Margaret N. Dykens requesting assistance in authenticating a possible rare volume by Charles Babbage. January 2020. E-mail.
Margaret N. Dykens (July 2020). “Charles Babbage’s Ninth Bridgewater Treatise in the SDNHM Library.” Interfaces: Essays and Reviews in Computing and Culture Vol. 1, Charles Babbage Institute, University of Minnesota, 17-22.
About the author: Margaret N. Dykens received her Master's degree in Biology from the College of William and Mary, Williamsburg, Virginia, in 1980. Upon graduation, she was hired as Staff Illustrator at the Harvard University Herbarium. Margaret went on to earn a second graduate degree in Library Science from the University of Michigan School of Information in 1993. In 1997, she became the Director of the Research Library for the San Diego Natural History Museum (SDNHM). In addition to directing the Research Library, she has curated two exhibitions: the first, The California Legacy of A.R. Valentien, based on the Museum's fine art collection, toured with her to numerous venues across the U.S. In 2016, she also curated the permanent exhibition Extraordinary Ideas from Ordinary People: A History of Citizen Science, based on fine art works, historical objects, and rare books from the Research Library.
Of Bugs, Languages and Business Models: A History
Alejandro Ramirez, PhD, Sprott School of Business – Carleton University
Abstract: A series of wrong decisions precipitated the Y2K crisis: adopting the 6-digit date format, making COBOL the standard in business computing, and discontinuing the teaching of COBOL in many American universities shortly after it was adopted. Did we learn anything from this crisis? (KW: Y2K crisis, COBOL, Internet history, Outsourcing.)
Twenty years ago, we averted the Y2K crisis. When we talk about it now, people are genuinely puzzled that it was such an expensive affair. They have a distorted idea of a crisis that did not happen: it was supposed to be the end of the world, yet in the end nothing actually happened. Then they wonder whether something similar could happen again. That is really the crux of the matter: what did we learn from the Y2K crisis?
Knowing the history of this crisis is an important and serious endeavour. It helps us understand how computer usage evolved, and what forces shaped our technology, our practices, and computers' contribution to society. History becomes an indispensable light guiding us in this understanding.
What were they thinking?
The employment of personnel to use computers in business became widespread in North America with the introduction of the IBM 1401 in 1959. Before then, machine-based data processing, where it existed, was generally executed by electromechanical accounting machines. Calendar dates, if needed, were fed in via punched cards indicating the date appropriate for each job. When programmers from the late 1950s to the mid-1960s decided, in order to save on memory costs (McCallum 2019), to use only the last two digits of the year, i.e., 60 instead of 1960, they never imagined that their programs would still be running at the end of the 20th century. After all, 40 years seemed a very long time, especially since they were saving approximately $16.00 USD per date by saving two bytes, 16 bits, of core memory valued at about one dollar per bit.
When IBM announced its new, more powerful System/360, with many innovative features compared to the 1400-series technology, it decided—in the interest of compatibility—that the system's date would also be a 6-digit date. To cement this practice, on November 1, 1968, the U.S. Department of Commerce, National Bureau of Standards, issued a Federal Information Processing Standard which specified the use of 6-digit dates for all information exchange among federal agencies (FIPS 1968). The standard became effective on January 1, 1970, enshrining the 6-digit date in government bureaucracy, again with little to no thought of the year 2000.
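The arithmetic of the failure is as simple as the arithmetic of the saving. Below is a minimal sketch of the problem, written in Python for brevity rather than the COBOL of the era; the function and values are illustrative only:

```python
def years_elapsed(start_yy: int, end_yy: int) -> int:
    """Elapsed years computed from two-digit year fields,
    as byte-pinching programs of the 1960s did."""
    return end_yy - start_yy

# A loan issued in 1985 ("85") and serviced in 1999 ("99"): correct.
print(years_elapsed(85, 99))   # 14

# The same loan serviced in 2000 ("00"): the elapsed time goes
# negative, so interest, ages, and expiry dates are all computed
# from a nonsensical interval. This is the essence of the Y2K bug.
print(years_elapsed(85, 0))    # -85
```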
It took about fifteen years for someone to realize that the 6-digit date might be a problem. Unknown to all but a few programmers, Jerome and Marilyn Murray published their call to arms, Computers in Crisis: How to Avert the Coming Worldwide Computer Systems Collapse, in 1984. They credited their daughter Rosanne, a senior research analyst at Systemhouse, Ltd., of Ottawa, for the origins of the book: "This book may not have been undertaken were it not for a lengthy telephone discussion of the dating problem with Rosanne…Her interest and encouragement have been unflagging" (Murray & Murray 1984, p. xix).
Shortly after the book was published, Spencer Bolles posted, on January 18, 1985, from his computer at Reed College in Oregon, the first recorded mention of the Year 2000 problem on a Usenet group: "I have a friend that raised an interesting question that I immediately tried to prove wrong. He is a programmer and has this notion that when we reach the year 2000, computers will not accept the new date" (Bolles 1985).
In March 1959, Burroughs Corporation computer scientist Mary Hawes called for an industry and government consortium to develop a standard programming language for business—promoting greater portability for organizational users transitioning between mainframe computers. With the appearance of Autocode, FLOW-MATIC, FORTRAN, ALGOL-58, and other 1950s programming languages, she recognized the high costs of proliferation.
Feeding the Beast
From the 1960s into the 1990s, many universities offered COBOL courses, as did companies and vocational schools like Control Data Institutes. Today, in an age where AI/analytics, games, robotics, cloud, and the Internet of things are foremost for many computer science students, few consider learning legacy systems and legacy languages. Accordingly, COBOL courses are scarce. A Slate article quoted Prof. John Zeanchock of Robert Morris University stating that just 37 colleges and universities globally have a "mainframe course" on the curriculum (Botella 2020). Most schools' faculty are unable to suggest legacy-specialist students or graduates when banks or local governments call. In our culture, innovation is revered and maintenance is not. In IT there is a myopic attention to the latest tech, and a failure to recognize and value that IT maintenance requires great skill and can itself be innovative (new processes, new fixes, etc.). Privileging innovation over maintenance is also in part tied to gender stereotypes and discrimination: historically, women have had greater opportunity in the critical areas of services, maintenance (both machines and debugging), and programming (from plugboards to languages), and fewer opportunities in computer and software engineering (Yost, 2011, 2017).
The percentage of women computer science majors has declined sharply over the past quarter century—from more than 35 percent in the 1980s to 18.1 percent in 2014, varying only slightly since (nsf.gov/statistics). The reasons are varied, but gender stereotyping, a male-dominant computing culture, and educational and workplace discrimination are factors (Abbate, 2012; Hicks, 2017; Misa, 2011). This has furthered labor shortages (in all areas, including legacy systems) and held back computer science. Labor shortages can become all the more profound in times of crisis, including the current health and economic crisis.
More than a Jersey Thing
On April 6, 2020, New Jersey Governor Phil Murphy made a public plea for volunteer "Cobalt" programmers (meaning COBOL) to help fix glitches in an overburdened unemployment benefits computer system more than 40 years old. New Jersey was having difficulty processing unemployment payments to the flood of new filers in a timely manner. The increased burden (in volume and parameters) on the unemployment system was a major bottleneck, or, to borrow Thomas Hughes' term, a reverse salient, blocking timely and accurate data processing for those in need (Hughes, 1983).
This sparked an onslaught of journalistic articles as well as many Twitter, Facebook, and other social media posts. The critiques ranged from Governor Murphy/New Jersey having an antiquated unemployment insurance computer system to calling for volunteers from the population segment most susceptible to COVID-19 risk—the elderly. Meanwhile, social media erupted with jokes and ageist images of elderly individuals as the potential volunteers.
Other states, including Connecticut and Kansas, had similar shortages of trained COBOL experts to confront unemployment insurance system challenges. Understandably, workers waiting for unemployment benefits are extremely frustrated and angry, expressing as much on the Kansas Department of Labor (KDoL) platform. Much is the matter with Kansas' system, with its origins in the 1970s and inadequate updates for flexibility and scale. In late April, KDoL indicated a timeline in which processing could occur by late May (for many, that pushes the wait to months). For states that have prioritized investing in updating other computer systems, but not unemployment insurance, it amounts to neglecting infrastructure that serves the most vulnerable in society.
Why do so many states have ill-equipped IT systems for unemployment benefits processing? Replacing long-existing systems is complex and expensive (hundreds of millions of dollars). Change is also disruptive to existing labor and existing skill sets. Unemployment systems serve those lacking political power, and federal and state governments deprioritize them. Further, systems (in all their technical, political, economic, and other contexts) become entrenched, or, to use Hughes' concept, gain momentum (Hughes, 1983). Failures and pressures can redirect momentum: some states scrambled for cloud solutions once systems crashed in April—possibly the least bad option, but with suboptimal timing, as standing up new systems and processes on the fly is especially difficult. Regardless, the problem is one of infrastructure—of not valuing maintenance, labor, and recipients. It is not merely COBOL versus the cloud; in fact, COBOL can and does integrate with AWS, Azure, and IBM clouds, and hybrid cloud is common.
State IT Workers and Hired Guns’ Heroic Efforts
North Texas' COBOL Cowboys staffing firm, larger IT services enterprises, and COBOL-skilled independent contractors are in great demand. The governors, state DoLs, and state CIOs are doing their best to staff up to address problems. For the systems analysts, programmers, and other state employees and contractors, the hours are long, the work difficult, and the efforts truly heroic. The Federal CARES Act's unemployment benefits, PUA/PEUC, allow states to extend the duration of benefits and include those usually not eligible—the self-employed. This adds greatly to both volume and complexity. In my playful title, "play" refers both to where work is performed (fewer coders choosing legacy) and to coders' creativity—in the spirit of CS metaphors like "sandbox" for building (non-live) code.
Domestic and Global Digital Divides
In the coming year, the overall percentage of Americans below the poverty line will peak higher than at any time in more than 50 years—and the impact on African-American, Hispanic, and Native-American populations is particularly severe. The disparity of access to health insurance, banking, loans, and information technology, as well as exposure to risk, and the disparity of incidence and mortality with COVID-19, highlight extreme and growing race and class inequality in the United States.
Washington D.C.'s unemployment platform urges benefits filers to use Microsoft Internet Explorer. Microsoft retired Explorer in January 2016; an unsupported version remains for computers, not smartphones. A 2019 Pew Research Center survey showed that 54 percent of Americans with incomes under $30,000 a year have a computer, while 71 percent have a smartphone. Of those making over $100,000, 94 percent have a computer and broadband at home (Anderson and Kumar, 2019). Only 58 percent of African-Americans have a computer, versus 82 percent of whites (Perrin and Turner, 2019). In digital infrastructure, just as in education, healthcare, and housing, there are two Americas.
Y2K: Why to Care
An earlier crisis largely involving COBOL, one with a long and visible runway, is both consequential context for and instructive to current challenges. About a quarter century ago, governments and corporations began seriously addressing the pending Y2K crisis—caused by two-digit date fields, often in COBOL code—to avert risks to life and the economy, and to make it a nonevent.
Investments and global cooperation were key, and the International Y2K Cooperation Center played a meaningful role in fostering collaboration. The shortage of programmers knowledgeable in COBOL, together with the lower expense and the overwhelming volume of code, led to outsourcing to an emerging Indian IT services industry. This lent momentum to that trade, and to a shifting geography of IT work that remains impactful (though corporate decision-makers are accelerating artificial intelligence applications producing further labor transformations, ones detrimental to Indian IT laborers, developments that standout ABD sociologist and CBI IDF Fellow Devika Narayan is insightfully analyzing). Gartner Inc. estimated U.S. government and business expenditures at up to $225 billion, a breathtaking sum indicative of the cost of putting off maintenance until a time-sensitive crisis. Passing into the new millennium with few major problems lent credence to two diverging interpretations: that heavy investment in maintenance had been necessary to avert catastrophe, or, more commonly (and less accurately), that it was an overhyped problem on which funds were squandered in preparation and maintenance fixes. Offshoring saved money in the short run, but may not have in the longer run: it left a legacy of less and less current on-shore COBOL expertise (for maintenance, updates, security, etc.)—a workforce and talent pool helpful in global crises, particularly ones in which unfortunate (U.S.) nationalistic tendencies and policies have inhibited international cooperation.
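What did those maintenance fixes actually look like? Rewriting every record to four-digit years was often too costly, so one widespread remediation was "windowing": interpreting two-digit years against a pivot so that stored 6-digit dates could be left untouched. The following Python sketch is a hedged illustration only, since approaches varied shop by shop and the pivot value here is assumed:

```python
PIVOT = 50  # assumed cutoff: 00-49 read as 2000s, 50-99 as 1900s

def expand_year(yy: int, pivot: int = PIVOT) -> int:
    """Expand a two-digit year to four digits using a fixed window,
    a common Y2K fix that avoided reformatting stored date records."""
    return 2000 + yy if yy < pivot else 1900 + yy

print(expand_year(85))  # 1985
print(expand_year(0))   # 2000, now correctly later than 1985
```

The fix was cheap, but it only moved the boundary: a fixed window with a pivot of 50 breaks again for dates from 2050 on, so windowed systems traded a hard deadline for a deferred one.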
CONNECT and Disconnects
Maintaining infrastructure is important. Anemic IT budgets have hurt not only opportunities to change and move to innovative new solutions, but also the ability to maintain existing systems and to assure that they perform, and perform at scale, in both normal times and crises. The reverse salient certainly is not always COBOL, or COBOL alone. State auditors warned Florida Governor Ron DeSantis that Florida's unemployment site, its "CONNECT" cyberinfrastructure, had more than 600 system errors in need of fixing, but that state officials had "no process to evaluate and fix" (Mower, 2020). It was a $77 million system from 2013 which, as he is quick to point out, his administration inherited. This underlines a challenge not just in Florida but in many states—the attitude that inadequate infrastructure is the predecessors' fault, not the current leaders' problem, and that fixes should fall to successors. Often the (now) multi-hundred-million-dollar cost typical of major upgrades to new unemployment insurance systems (and their ongoing refinement) is difficult to meet without federal assistance. Florida's CONNECT is a reminder of damaging disconnects, and of leaders' inattention to infrastructure for vulnerable people. The problem is also one of meager and dwindling federal support. Federal aid for state unemployment administration has been dropping for a quarter century, with severe cuts in 2018 and 2019. In a pre-COVID-19 survey, more than half of the states responded that their unemployment system problems were "serious" or "critical" (Botella 2020).
Neglected Infrastructure and Crashes
Working two-tenths of a mile from the site of the 2007 Interstate 35 West Mississippi River Bridge collapse in Minneapolis provides a frequent reminder that strong, safe, and well-maintained infrastructure is essential. Twenty-eight percent of infrastructure project funding at the state level comes from federal grants (primarily for physical infrastructure). States' invisible software infrastructure is starved, especially unemployment systems. Hopefully the COVID-19 pandemic leads not only to evaluating our medical preparedness with ICUs, PPE, and unmet needs in free-enterprise insurance and healthcare, but also to greater evaluation of IT infrastructures. Ideally, these developments will lead all governors with poor unemployment insurance system performance to the same conclusion as Governor Murphy about the need for post-mortems on digital infrastructure. As he put it, "how the heck did we get here when we literally needed COBOL programmers"—learning from the past is important.
One thing clear from the two COBOL crises is that history and archives matter—my thoughts here have at best scratched the surface of fundamental IT infrastructure and contexts that someone could analyze with tremendous depth using Charles Babbage Institute resources. CBI's archival and oral history resources (most transcripts online, all free) for studying the Y2K crisis and the history of CODASYL and COBOL (and many other topics and themes in the history and social study of computing) are the finest and most extensive in the world. A talented University of Pennsylvania doctoral candidate in the History and Sociology of Science, Zachary Loeb, has drawn on CBI's International Y2K Cooperation Center Records for his important dissertation on the cultural, political, and technical history of Y2K.
Over the years, a number of researchers have used our Conference on Data Systems Languages (CODASYL) Records. While that collection stands out in documenting COBOL and the group's work with databases (what occurred in 1959 and far beyond), we have many other COBOL materials in a variety of collections. One such collection (a recent one) is our largest overall at more than 500 linear feet, the Jean Sammet Papers—Sammet may have been the single most important developer of COBOL. Likewise, our Frances E. ("Betty") Holberton Papers have rich material on CODASYL and COBOL. There is also great COBOL content in our Burroughs Corporate Records, Control Data Corporation Records, Gartner Group Records, Auerbach and Associates Market and Product Reports, IBM SHARE, Inc., HOPL 1978, Charles Phillips Papers, Jerome Garfunkel Papers, Warren G. Simmons Papers, National Bureau of Standards Computer Literature, Computer Manuals, and many other collections. COBOL's history is one of government, industry, and intermediary partnerships, standards, maintenance, labor, gender, politics, culture, and much more. In a technical area that always seems focused on the new, new thing, its 60-year past and its continuing presence deserve greater study.
Abbate, Janet. (2012). Recoding Gender: Women’s Changing Participation in Computing, MIT.
Allyn, Bobby. (2020). “COBOL Cowboys Aim to Rescue the Sluggish State Unemployment Systems." NPR, April 22, 2020.
Anderson, Monica and Madhumitha Kumar. (2019). "Digital Divide Persists…" Pew Research Center, May 7, 2019.
Botella, Ella. (2020). “Why New Jersey’s Unemployment System Uses a 60-Year-Old Programming Language.” Slate, April 9, 2020.
Charles Babbage Institute Archives (finding aids to the collections mentioned in final paragraph).
Hicks, Marie. (2017). Programmed Inequality: How Britain Discarded Women Technologists and Lost Its Edge in Computing, MIT Press.
Hughes, Thomas P. (1983). Networks of Power: Electrification in Western Society, 1880 to 1930, Johns Hopkins University Press.
Kennelly, Denis. (2019) “Three Reasons Companies are only 20% Into Cloud Transformation.” IBM.com, March 5, 2019.
King, Ian. (2020). “An Ancient Computer System is Slowing Giant Stimulus.” Bloomberg.com, April 13, 2020.
Mazmanian, Adam. (2014). “DoD Plans Upgrade to COBOL-based Contract System” FCW, July 7, 2014.
Misa, Thomas J., ed. (2011). Gender Codes: Why Women are Leaving Computing, Wiley.
Mower, Lawrence. (2020). “Ron DeSantis…” Tampa Bay Times, March 31, 2020.
Perrin, Andrew and Erika Turner. (2019) “Smartphones Help Blacks and Hispanics Bridge Some—But Not All—Digital Gaps with Whites,” Pew Research Center, August 20, 2019.
Yost, Jeffrey R. (2011). “Programming Enterprise: Women Entrepreneurs in Software and Computer Services,” in Misa, ed. [full cite above].
Yost, Jeffrey R. (2017). Making IT Work: A History of the Computer Services Industry, MIT Press.
Special thanks to CBI Acting Archivist Amanda Wick for discussion/insights on COBOL and our collections.
Jeffrey R. Yost (May 2020). "Where Dinosaurs Roam and Programmers Play: Reflections on Infrastructure, Maintenance, and Inequality." Interfaces: Essays and Reviews in Computing and Culture Vol. 1, Charles Babbage Institute, University of Minnesota, 1-8.
About the author: Jeffrey R. Yost is CBI Director and HSTM Research Professor at the University of Minnesota. He has published six books (and dozens of articles), most recently Making IT Work: A History of the Computer Services Industry (MIT Press, 2017) and FastLane: Managing Science in the Internet World (Johns Hopkins U. Press, 2016) [co-authored with Thomas J. Misa]. He is a past EIC of IEEE Annals of the History of Computing, and current Series Co-Editor [with Gerard Alberts] of Springer’s History of Computing Book Series. He has been a principal investigator of a half dozen federally sponsored projects (NSF and DOE) on computing/software history totaling more than $2 million. He is Co-Editor [with Amanda Wick] of Interfaces: Essays and Reviews in Computing & Culture.