Interfaces

Essays and Reviews in Computing and Culture 


Interfaces publishes short essay articles and essay reviews connecting the history of computing/IT studies with contemporary social, cultural, political, economic, or environmental issues. It seeks to be an interface between disciplines, and between academics and broader audiences. 

Co-Editors-in-Chief: Jeffrey R. Yost and Amanda Wick

Managing Editor: Melissa J. Dargay

2022 (Vol. 3) Articles

 

FWB Core Team Member @JoseRMeijia on Twitter: “This is the way” (2022).

Introduction

If I were part of any DAO, I would want it to be “Friends With Benefits.” It is just so darn cool. As a vortex of creative energy and cultural innovation, the purpose of its existence seems to be to have fun. FWB is a curated Decentralized Autonomous Organization (DAO) that meets in the chat application ‘Discord’ and is filled with DJs, artists, and musicians. It has banging public distribution channels for writing, NFT art, and more. This DAO crosses from the digital realm to the physical via its member-only ticketed events around the world, including exclusive parties in Miami, Paris, and New York. The latest of these events was “FWB Fest,” a three-day festival in the forest outside of LA. It was being ‘in’ and ‘with’ the DAO at Fest that I realized that this DAO, like many others, hasn’t yet figured out decentralized governance.

On top of the fundamental infrastructure layer of public blockchain protocols exists the idea of “Decentralized Autonomous Organizations” (DAOs). Scholars define DAOs as a broad organizational framework that allows people to coordinate and self-govern through rules deployed on a blockchain instead of issued by a central institution (Hassan & De Filippi, 2021; Nabben, 2021a). DAOs are novel institutional forms that manifest for a variety of purposes and according to varying legal and organizational arrangements. These include protocol DAOs that provide a foundational infrastructure layer, as well as investment vehicles, service providers, social clubs, or combinations of these purposes (Brummer & Seira, 2022). The governance rules and processes of DAOs, as well as the degree to which they rely on technology and/or social processes, depend on the purpose, constitution, and members of a particular DAO. Governance in any decentralized system fundamentally relies on relationships between individuals in flat social structures, enabled through technologies that support connection and coordination without any central control (Mathew, 2016). Yet, as nascent institutional models, there are few formally established governance models for DAOs, and what does exist is a blend of technical standards, social norms, and experimental practices (significant attempts to develop in this direction include the ‘Gnosis Zodiac’ DAO tooling library and the ‘DAOstar’ standard proposal (Gnosis, 2022; DAOstar, 2022)). DAOs are large-scale, distributed infrastructures. Thus, analogising DAO governance to Internet governance may provide models for online-offline stakeholder coordination, development, and scale.

The Internet offers one example of a pattern for the development and governance of large-scale, distributed infrastructure. There exists a rich historical literature on the emergence of the Internet, the key players and technologies that enabled it to develop, and the social and cultural factors that influenced its design and use (Abbate, 2000; Mailland & Driscoll, 2017). Internet governance refers to policy and technical coordination issues related to the exchange of information over the Internet, in the public interest (DeNardis, 2013). It is the architecture of network components and the global coordination amongst actors responsible for facilitating the ongoing stability and growth of this infrastructure (DeNardis & Raymond, 2013). The Internet is kept operational through coordination regarding standards, cybersecurity, and policy. As such, governance of the Internet provides a potential model for DAOs, as a distributed infrastructure with complex and evolving governance bodies and stakeholders.

The Internet is governed through a unique model known as ‘multi-stakeholder governance’. Multistakeholderism is an approach to the coordination of multiple stakeholders with diverse interests in the governance of the Internet. Multistakeholderism refers to policy processes that allow for the participation of the primary affected stakeholders or groups who represent different interests (Malcolm, 2008; 2015). The concept of multi-stakeholder governance is often associated with characteristics like “open”, “transparent”, and “bottom-up”, as well as “democratic” and “legitimate”. Scholar Jeremy Malcolm synthesizes these concepts into the following criteria:

1. Are the right stakeholders participating? That is, are there sufficient participants to represent the perspectives of everyone with a significant interest in any policy directed at a governance problem?

2. How is participation balanced? That is, are policy development processes designed to roughly balance the views of stakeholders, either ahead of time or through a deliberative democratic process in which the roles of stakeholders and the balancing of their views are more dynamic (but usually subject to a formal decision process)?

3. How are the body and its stakeholders accountable to each other for their roles? That is, is there trust between the host body and its stakeholders that the host body will take responsibility for fairly balancing the perspectives of participants, and that stakeholders claim a legitimate interest to contribute?

4. Is the body an empowered space? That is, how closely is stakeholder participation linked to spaces in which mutual decisions are made, as opposed to spaces that are limited to discussion and do not lead to authoritative outcomes (2015)?

5. The fifth criterion, which I contribute in this piece, is: is this governance ideal maintained over time?

In this essay, I employ a Science and Technology Studies lens and autoethnographic methods to investigate the creation and development of a “Decentralized Autonomous Organization” (DAO) provocatively named “Friends With Benefits” (FWB) in its historical, cultural, and social context. Autoethnography is a research method that uses personal experiences to describe and interpret cultural practices (Adams et al., 2017). This autoethnography took place online through digital ethnographic observation in the lead-up to the event and culminated at “FWB Fest”. Fest was a first-of-its-kind, multi-day “immersive conference and festival experience at the intersection of culture and Web3” hosted by FWB in an Arts Academy in the woods of Idyllwild, two hours out of LA (FWB, 2022a). In light of the governance tensions between peer-to-peer economic models and private capital funding that surfaced, I explore how the Internet governance criteria of multistakeholderism can apply to a DAO as a governance model for decentralized coordination among diverse stakeholders. This piece aims to offer a constructive contribution to exploring how DAO communities might more authentically represent their values in their own governance in this nascent and emerging field. I apply the criteria of multistakeholder governance to FWB DAO as a model for meaningful stakeholder inclusion in blockchain community governance. Recounting my experiences of FWB Fest reveals the need for decentralized governance models on which DAO communities can draw to scale their mission in line with their values.

A Digital City

FWB started as an experiment among friends in the creative industries who wanted to learn about crypto. The founder of the DAO is a hyper-connected LA music artist and entrepreneur named Trevor McFedries. While travelling around the world as a full-time band manager, McFedries used his time between gigs to locate Bitcoin ATMs and talk to weird Internet people. Trevor ran his own crypto experiment by “airdropping” a made-up cryptocurrency token to his influencer and community-building friends in tech, venture capital, and creative domains and soon, FWB took off. McFedries is not involved in the day-to-day operations of the DAO but showed up at FWB Fest and was “blown away” by the growth and progress of the project. The FWB team realized the DAO was becoming legitimate as more and more people wanted to join during the DAO wave of 2021-2022. This was compounded by COVID-19, as people found a sense of social connection and belonging by engaging in conversations in Discord channels amidst drawn-out lockdowns and isolation. When those interested in joining extended beyond friends of friends, FWB launched an application process. Now, the DAO has nearly 6,000 members around the world and is preparing for its next phase of growth.

FWB’s vision is to equip cultural creators with the community and Web3 tools they need to gain agency over their production by making the concepts and tools of Web3 more accessible, building diverse spaces and experiences that creatively empower participants, and developing tools, artworks, and products that showcase Web3’s potential. The DAO meets online via the (Web2) chat application ‘Discord’. People can join various special interest channels, including fashion, music, art, NFTs, and so on. To become a member, one must fill out an application, pass an interview with one of the 20-30 rotating members of the FWB Host Committee, and then purchase 75 of the native $FWB token at market price (which has ranged over the past month from approximately $1,000 USD to $10,000 USD). Membership also provides access to a token-gated event app called “Gatekeeper”, an NFT gallery, a Web3-focused editorial outlet called “Works in Progress”, and in-person party and festival events. According to the community dashboard, the current treasury is $18.26M (FWB, n.d.).
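To make the token-gating mechanic concrete, here is a minimal sketch of how a 75-token membership check against an ERC-20 balance might look. It is an illustration only, assuming web3.py (v6): the RPC endpoint and token contract address are placeholders, and this is not FWB’s actual infrastructure or the code behind Gatekeeper.

```python
"""Minimal sketch of an ERC-20 token gate (hold >= 75 tokens), using web3.py v6.
The RPC URL and token address are placeholders, not FWB's real values."""
from web3 import Web3

# Minimal ABI: only the two read-only functions the check needs.
ERC20_ABI = [
    {"name": "balanceOf", "type": "function", "stateMutability": "view",
     "inputs": [{"name": "owner", "type": "address"}],
     "outputs": [{"name": "", "type": "uint256"}]},
    {"name": "decimals", "type": "function", "stateMutability": "view",
     "inputs": [], "outputs": [{"name": "", "type": "uint8"}]},
]

MEMBERSHIP_THRESHOLD = 75  # whole tokens, per the membership rule described above


def is_member(rpc_url: str, token_address: str, wallet: str) -> bool:
    """Return True if the wallet holds at least the membership threshold of tokens."""
    w3 = Web3(Web3.HTTPProvider(rpc_url))
    token = w3.eth.contract(address=Web3.to_checksum_address(token_address),
                            abi=ERC20_ABI)
    balance = token.functions.balanceOf(Web3.to_checksum_address(wallet)).call()
    decimals = token.functions.decimals().call()
    # Compare raw on-chain units against the threshold scaled by the token's decimals.
    return balance >= MEMBERSHIP_THRESHOLD * 10 ** decimals


# Example call (placeholder values):
# is_member("https://example-rpc.invalid", "0xTokenAddress...", "0xMemberWallet...")
```

A check like this is all a token gate amounts to: access decisions hinge on a wallet balance, which is why the market price of the token directly sets the price of belonging.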

“FWB vision” by Fiona Carty.

It appeared to me that the libertarian origins of Bitcoin as a public, decentralized, peer-to-peer protocol had metamorphosed into people wanting to own their own networks in the creative industries. The DAO has already made significant progress towards this mission, with some members finding major success at the intersection of crypto and art. One example is Eric Hu, whose generative AI butterfly art “Monarch” raised $2.5 million in presale funds alone (Gottsegen, 2021). “The incumbents don’t get it,” stated one member. “They want to build things that other people here have done but ‘make it better’. They never will.”

The story of how I got to FWB Fest is the same as everybody else’s. I got connected through a friend who told me about the FWB Discord. I was then invited to speak at FWB Fest based on a piece I wrote for CoinDesk on crypto and live action role-playing (LARPing), referring to role-playing games used for educational or political purposes with the aim of awakening or shaping thinking (Nabben, 2021b; FWB, 2022b). The guiding meme of Fest was “a digital city turns into an offline town”. In many ways, FWB Fest embodied a LARP in cultural innovation, peer-to-peer economies, and decentralized self-governance.

The infrastructure of the digital city is decentralized governance. The DAO provides something for people to coalesce around. It serves as a nexus, larger than the personal connections of its founder, where intersectional connections of creativity collide in curated moments of serendipity. Membership upon application provides a trusted social fabric that brings accountability through reputation to facilitate connections, creativity, and business. In this tribal referral network, “it’s amazing the connections that have formed with certain people, and it’s only going to grow,” states core team member Jose. Having pre-verified friends scales trust in a safe and accessible way. “Our culture is very soft,” stated Dexter, a core team member, during his talk with Glen Weyl on the richness of computational tools and social network data. It is a gentle way to learn about Web3, where people’s knowledge and experience are at all levels, questions are okay, and the main focus is shared creative interests with just a hint of Web3.

The next plan for the DAO, as I found out, is to take the lessons learned from FWB Fest and provide a blueprint for members to host their own FWB events around the world and scale the impact of the DAO. These localizations will be based on the example set by the DAO in how to run large-scale events, secure sponsors, manage participation using Web3 tools, carry the culture and mission of FWB, and garner more members. In the words of core team member Greg, the concept is based on architect and design theorist Christopher Alexander’s work on pattern languages: unique, repeatable actions that formulate a shared language for the re-creation of a space that is alive and whole (Alexander, 1964). Localising the cultural connections and influence the DAO provides offers a new dimension in the scale and impact of the DAO, states core team member Alex Zhang. FWB is providing the products and tooling to enable this decentralization through localization. Tools like the Gatekeeper ticketing app (built by core team member Dexter, a musician and self-taught software developer) provide a pattern that enables community members to take ownership of running their own events by managing ticketing in the style and culture of FWB.

Multiple Stakeholders Governing the Digital City

It wasn’t until my final evening of the Fest that I realized that FWB itself had raised $10M in VC capital at a $100M valuation from some of the biggest names in US venture capital, including Andreessen Horowitz (a16z). In the press release, a16z states, “FWB represents a new kind of DAO…it has become the de facto home of web3’s growing creative class” (2021). The capital is intended to scale the “IRL” footprint of the DAO through local events around the world called “FWB Cities.” “Crypto offers a dramatically more incentive-aligned way for creatives to monetize their passions, but we also recognize that the adoption hurdles have remained significant. FWB serves as the port city of web3, introducing a culturally influential class to crypto by putting human capital first”.

The raise was controversial within the community, judging by the discussions that occurred, the community calls, and the sentiment afterwards (although this was not reflected in the outcome of the vote, which passed at 98%). Some see it as the financialization of creativity. “All this emphasis on ownership and value. And I feel like I’m contributing to it by being here!” stated one LARPer at FWB Fest, who runs an art gallery IRL. If the rhizomatic, self-replicating decentralization thing can work, then we all need to own it together. “Right now, it’s still a fucking pyramid.”

Crypto communities are at risk of experiencing the corruption of the ideal of decentralization. This has already been a hard lesson in Internet governance, which has undergone a trajectory from the early Internet of the 1980s settling on the TCP/IP standard protocol, to regional networks and the National Science Foundation (NSF) taking on the Internet as NSFNET in the 1980s and early 1990s, to the privatization of the Internet under the Clinton Administration in the mid-1990s and the sale of important elements to corporations such as Cisco Systems, to the rise of big tech giants with significant political influence and platform dominance by Microsoft, Google, Apple, and Facebook (Abbate, 2000; Tarnoff, 2022). Infrastructure is complex and fraught with the dynamics of power and authority (Winner, 1980). It is difficult to operate counter to the culture you come from without perpetuating it. If Web2 governance and capital allocation strategies are being perpetuated instead of new ones that facilitate the values of Web3, this has a direct effect on decentralized governance and community participation.

This DAO community, like many others, hasn’t yet figured out decentralized governance. For its next phase of growth and its mission to empower its constituency, it has to. So far, the community has remained successfully intact, or “unforked”. Yet “progressive decentralization” through the localisation of events is not the same as meaningful empowerment to govern the organization. The goal and incentives of any DAO should not be a start-up style exit, especially not for a social DAO. To quote one main stage speaker, Kelani from eatworks.xyz, “The artist’s goal is to misuse technology. It’s a subversive outcome”. DAOs come from political origins and are about developing infrastructure to facilitate countercultural social movements (Nabben, 2022): in this case, to subvert existing capital models and create an innovation flywheel for peer-to-peer production in sustainable ways. In the domain of creativity, even failure equals progress and a “victory for art”.

The animating purpose of FWB DAO is to allow people to gain agency by creating new economies and propagating cultural influence. Yet it has resorted to traditional venture capital models to bootstrap its business. The purpose of creating opportunities for new economic models must carry through each localisation, whilst somehow aligning members with the overarching DAO. The concept of multi-stakeholder governance offers a pattern for how to design for this.

Source: FWB newsletter. July, 2022.

Applying the Criteria of Multi-stakeholder Governance to the Digital City

The principles that stakeholders adhere to in the governance of the Internet are one place to look for a historical example of how distributed groups govern the development and maintenance of distributed, large-scale infrastructure networks. Multistakeholderism acknowledges the multiplicity of actors, interests, and political dynamics in the governance of large-scale infrastructures and the necessity of meaningful stakeholder engagement in governance across diverse groups and interests. This allows entities to transform controversies, such as the VC “treasury diversification” raise, into productive dialogue which positions stakeholders in subsequent decision-making for more democratic processes (Berker, 2011). In the next section of this essay, I apply the criteria of meaningful multi-stakeholder governance as articulated by Malcolm (2015) to FWB DAO, as a potential model for helping the DAO balance stakeholder interests and participation as it diversifies and scales.

 

  1. Are the right stakeholders participating?

The right stakeholders to be participating in FWB DAO include all those with a significant interest in creating DAO policies or solving DAO problems. This includes core team members employed by the DAO, long-term as well as newer members, and investors. It requires structural and procedural admission of those who self-identify as interested stakeholders (Malcolm, 2015).

 

  2. How is their participation balanced?

In the community calls where FWB members got to conduct Q&A with their newfound investors, the VCs indicated their intention to ‘delegate’ their votes to existing members, but to whom remains unclear. There must be mechanisms to balance the power of stakeholders to facilitate them reaching a consensus on policies that are in the public interest (Malcolm, 2015). FWB does not yet have this in place (to my knowledge, at the time of writing). This can be achieved through a number of avenues, including prior agreement on the unique roles, contributions, expertise, and resource control of certain stakeholders, or deliberative processes that flatten hierarchies by requiring stakeholders to defend their position in relation to the public interest (Malcolm, 2015). Some decentralized communities have also been experimenting with governance mechanisms that move beyond simple ‘yes’/‘no’ voting. One example is the use of “Conviction Voting” to signal community preference over time and pass votes dynamically according to support thresholds (Zargham & Nabben, 2022), sketched below.
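To illustrate the mechanism, here is a minimal sketch of conviction voting under assumed parameters: support staked on a proposal accumulates “conviction” over time, and the proposal passes once conviction crosses a threshold. The decay factor, threshold, and token amounts are invented for illustration and are not drawn from FWB’s governance or any production implementation.

```python
"""Illustrative sketch of conviction voting: support accumulates over time and a
proposal passes once accumulated 'conviction' crosses a threshold. Parameters are
invented for illustration only."""
from dataclasses import dataclass

ALPHA = 0.9  # decay factor: how quickly past support fades (assumed value)


@dataclass
class Proposal:
    threshold: float        # conviction needed to pass (set by the community)
    conviction: float = 0.0


    def tick(self, tokens_staked: float) -> None:
        """Advance one time step: past conviction decays, current support adds to it."""
        self.conviction = ALPHA * self.conviction + tokens_staked

    @property
    def passes(self) -> bool:
        return self.conviction >= self.threshold


# Usage: steady support of 100 tokens eventually passes a threshold of 900,
# because conviction converges toward tokens_staked / (1 - ALPHA) = 1000.
p = Proposal(threshold=900)
steps = 0
while not p.passes:
    p.tick(tokens_staked=100)
    steps += 1
print(f"passed after {steps} time steps, conviction={p.conviction:.1f}")
```

The design point is that sustained, time-weighted support can carry a proposal, rather than a single snapshot of token-weighted ‘yes’/‘no’ votes.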

 

  3. How are the body and its stakeholders accountable to each other for their roles in the process?

FWB DAO is accountable to its members for the authority it exercises as a brand and an organization. Similarly, through localised events, participants are accountable for legitimately representing the FWB brand, using its tools (such as the Gatekeeper ticketing app), and acquiring new members that pay their dues back to the overarching DAO. Mechanisms for accountability include stakeholders accepting the exercise of the authority of the host body, the host body operating transparently and according to organizational best practices, and stakeholders actively participating according to their roles and responsibilities (Malcolm, 2015).

 

  4. Is the body an empowered space?

For multistakeholder governance to ensue, the host body must meaningfully include stakeholders in governance processes, meaning that stakeholder participation is linked to spaces in which definitive decisions are made and outcomes are reached, rather than just deliberation or expression of opinion (Malcolm, 2015).

At present, participation in FWB DAO governance is limited, at best. Proposals are gated by team members who help edit, shape, and craft the language according to a template before they can be posted to Snapshot by the Proposal Review Committee. Members can vote on proposals, with topics including the “FWB x Hennessy Partnership,” grant selections, and liquidity management. According to core team members in their public talks, votes typically pass with 99% in favor every time, which is not a good signal of genuine political engagement and a healthy democracy.

 

  5. Is this governance ideal maintained over time?

A criterion missing in the current principles of multistakeholderism is how the ideals of decentralized governance can persist over time. It is widely acknowledged that the Internet’s model of governance is not congruent with the initial ideals some held for a ‘digital public’, which has become privatized, monetized, and divisive. Inner power structures controlled by private firms and sovereign States permeate the architectures and institutions of Internet governance (DeNardis, 2014). Some argue that this corruption of the ideal over time can be addressed by deprivatizing the Internet to re-direct power away from big tech firms and towards more public engagement and democratic governance (Tarnoff, 2022). In reality, both privatized network governance models and public ones can be problematic (Nabben et al., 2020). The promise of a social DAO, and of crypto communities more broadly, is innovation in decentralized governance: the ability to make technical and political guarantees of certain principles.

The ideals of public, decentralized blockchain communities are at risk of following a similar trajectory to the Internet. What began with grassroots activism against government and corporate surveillance in the computing age (Nabben, 2021a) could be co-opted by the interests of big money, government regulation, and private competition (such as Central Bank Digital Currencies, Facebook’s ‘Meta’, Visa and Amex, etc.). For FWB to avoid this trajectory from enthusiastic early community to centralized concentration of power, a long-term view of governance must be taken. This demands deeper consideration and innovation towards a pattern language for decentralized governance itself.

Conclusion

Experiencing the governance dynamics of a social DAO surfaces some of the challenges of governing and scaling distributed infrastructure that blends multi-stakeholder, online-offline dynamics with the values of decentralization. The goal of FWB DAO is to allow people to gain agency through the creation of new economies that then propagate through cultural influence. This goal must carry through each localization and somehow align back to the overarching DAO as the project scales, to create not just culture but to further the cause of decentralization. What remains to be seen is how this creative community can collectively facilitate authentic, decentralized organizing for its impassioned believers through connections, tools, funding, and creative ingenuity on governance itself. Without incorporating the principles of meaningful multistakeholder inclusion in governance, DAOs risk becoming a ‘myth of decentralization’ (Mathew, 2016), riddled with power concentrations in practice. The principles of multi-stakeholderism from Internet governance offer one potentially viable set of criteria to guide the development of more meaningful decentralized governance practices and norms. Yet multistakeholder governance is intended to balance public interests and political concerns in particular contexts, not to serve as a model for all distributed governance functions (DeNardis & Raymond, 2013). Thus, the call to Decentralized Autonomous Organizations is to leverage the insights of existing governance models whilst innovating their own principles and tools: to continue exploring, applying, and testing governance models and authentically pursue their aims.

 


Bibliography

A16Z. (2021). “Investing in Friends With Benefits (a DAO)”. Available online: https://a16z.com/2021/10/27/investing-in-friends-with-benefits-a-dao/. Accessed October, 2022.

Abbate, J. (2000). Inventing the Internet. MIT Press, Cambridge.

Adams, T. E., Ellis, C., & Jones, S. H. (2017). Autoethnography. In The International Encyclopedia of Communication Research Methods (pp. 1–11). John Wiley & Sons, Ltd. https://doi.org/10.1002/9781118901731.iecrm0011

Alexander, C. 1964. Notes on the Synthesis of Form (Vol. 5). Harvard University Press.

Brummer, C J., and R Seira. (2022). “Legal Wrappers and DAOs”. SSRN. Accessed 2 June, 2022. http://dx.doi.org/10.2139/ssrn.4123737.

Berker, T. (2011). Michel Callon, Pierre Lascoumes and Yannick Barthe, “Acting in an Uncertain World: An Essay on Technical Democracy”. Minerva, 49, 509–511. https://doi.org/10.1007/s11024-011-9186-y

DAOstar. (2022). “The DAO Standard”. Available online: https://daostar.one/c89409d239004f41bd06cb21852e1684. Accessed October, 2022.

DeNardis, L. (2013). “The emerging field of Internet governance”. In W. H. Dutton (Ed.), The Oxford handbook of Internet studies (pp. 555–576). Oxford, UK: Oxford University Press. https://doi.org/10.1093/oxfordhb/9780199589074.013.0026.

DeNardis, L. (2014). The Global War for Internet Governance. Yale University Press: New Haven, CT and London.

DeNardis, L. and Raymond, M. (2013). “Thinking Clearly About Multistakeholder Internet Governance”. GigaNet: Global Internet Governance Academic Network, Annual Symposium 2013, Available at SSRN: https://ssrn.com/abstract=2354377 or http://dx.doi.org/10.2139/ssrn.2354377

Epstein, D., C Katzenbach, and F Musiani. (2016). “Doing Internet governance: practices, controversies, infrastructures, and institutions.” Internet Policy Review.

FWB. (2022a). “FWB Fest 22”. FWB. Available online: https://fest.fwb.help/. Accessed October, 2022.

FWB. (2022b). “Kelsie Nabben: What are we LARPing about? | FWB Fest 2022”. YouTube (video). Available online: https://www.youtube.com/watch?v=UUoQ-sBbqeM. Accessed October, 2022.

FWB (n.d.). “Pulse”. FWB. Available online: https://www.fwb.help/pulse. Accessed October, 2022.

Gnosis. (2022). “Zodiac Wiki”. Available online: https://zodiac.wiki/index.php/ZODIAC.WIKI/. Accessed October, 2022.

Gottsegen, W. (2021). “Designer Eric Hu on Generative Butterflies and the Politics of NFTs”. CoinDesk. Available online: https://www.coindesk.com/tech/2021/10/07/designer-eric-hu-on-generative-butterflies-and-the-politics-of-nfts/. Accessed October, 2022.

Hassan, S., and P. De Filippi. (2021). "Decentralized Autonomous Organization." Internet Policy Review 10, no. 2:1-10.

Meijia, Jose (@JoseRMeijia). (2022). “This is the way”. Twitter. Available online: https://twitter.com/makebrud/status/1556691400367824896. Accessed 1 October, 2022.

Mailland, J. and K. Driscoll. (2017). Minitel: Welcome to the Internet. MIT Press, Cambridge.

Malcolm, J. (2008). Multi-Stakeholder Governance and the Internet Governance Forum. Wembley, WA: Terminus Press.

Malcolm, J. (2015). “Criteria of meaningful stakeholder inclusion in Internet governance.” Internet Policy Review, 4(4). https://doi.org/10.14763/2015.4.391

Mathew, A. J. (2016). “The myth of the decentralised Internet.” Internet Policy Review, 5(3). https://doi.org/10.14763/2016.3.425

Nabben, K. (2021a). “Is a "Decentralized Autonomous Organization" a Panopticon? Algorithmic governance as creating and mitigating vulnerabilities in DAOs.” In Proceedings of the Interdisciplinary Workshop on (de) Centralization in the Internet (IWCI'21). Association for Computing Machinery, New York, NY, USA, 18–25. https://doi.org/10.1145/3488663.3493791

Nabben, K. (2021b). “Infinite Games: How Crypto is LARPing”. CoinDesk. Available online: https://www.coindesk.com/layer2/2021/12/13/infinite-games-how-crypto-is-larping/. Accessed October, 2022.

Nabben, K. (2022). “A Political History of DAOs”. FWB WIP. Available online: https://www.fwb.help/editorial/cypherpunks-to-social-daos. Accessed October, 2022.

Nabben, K., Poblet, M., and Gardner-Stephen, P. (2020). "The Four Internets of COVID-19: the digital-political responses to COVID-19 and what this means for the post-crisis Internet." 2020 IEEE Global Humanitarian Technology Conference (GHTC), pp. 1-8. doi: 10.1109/GHTC46280.2020.9342859.

Tarnoff, B. (2022). Internet for the People: The Fight for Our Digital Future. Verso Books: Brooklyn.

Winner, L. (1980). “Do Artifacts Have Politics?” Daedalus, 109(1), 121–136. Retrieved from http://www.jstor.org/stable/20024652

Zargham, M., and K Nabben. (2022). “Aligning ‘Decentralized Autonomous Organization’ to Precedents in Cybernetics”. SSRN. Accessed June 2, 2022. https://ssrn.com/abstract=4077358.

 

Kelsie Nabben. (November 2022). “Decentralized Governance Patterns: A Study of "Friends With Benefits" DAO.” Interfaces: Essays and Reviews on Computing and Culture Vol. 3, Charles Babbage Institute, University of Minnesota, 86-101.

 


About the Author: Kelsie Nabben is a qualitative researcher in decentralised technology communities. She is particularly interested in the social implications of emerging technologies. Kelsie is a recipient of a PhD scholarship at the RMIT University Centre of Excellence for Automated Decision-Making & Society, a researcher in the Blockchain Innovation Hub, and a team member at BlockScience.


 

 

The Robots Are Among Us and 2062: The World that AI Made
Figure 1: The two books under examination. Picture taken by the author.

I want to share an anecdote. My doctoral fieldwork consisted of mixed historical analysis and interview-based research on artificial intelligence (AI) promises and expectations. I attended numerous talks on AI and robotics and frequently posted on social media about interesting material I encountered during my doctoral investigations. On July 15th, 2018, I received a generous gift by mail, sent by a very kind Instagram user named Chris Ecclestone, who, after a brief online chat about my PhD through the platform’s messaging utility, insisted he had to send me something he found at his local charity shop (the charity-oriented UK equivalent of a thrift store/second-hand shop). The book’s title was The Robots Are Among Us, authored by a certain Rolf Strehl, published in 1955 by the Arco Publishing Company.

I was only able to find very limited information about Strehl – the most comprehensive information available online comes from a blogpost written by workers at the Heinz Nixdorf computer museum. From this, we learn, with the aid of online translation from German, that “he was born in Altona in 1925 and died in Hamburg in 1994,” that while writing this book “he was editor-in-chief of the magazine ‘Jugend und Motor’” (‘Youth and Motor,’ a popular magazine about automobiles), and that the book comes with a “number of factual errors” and “missing references.” According to the same website, the original 1952 German version of Die Roboter sind unter uns (Gerhard Stalling Verlag, Oldenburg) was among the first two nonfiction books written about robots and intelligent machines in German, translated into several languages. A quick Google Images search proved that, in addition to my copy of the English translation, the book was also published, with slightly modified titles, in several other languages: in Spanish (Han Llegado Los Robots – Ediciones Destino, Barcelona), Italian (I Robot Sono Tra Noi – Bompiani Editore, Milan), and French (Cerveaux Sans Âme: Les Robots – Editions et publications Self, Paris). This suggests that the book was considered by several international publishers to be credible enough for wide circulation, and as the English version’s paper inlay states, the book “is written with a minimum of technical jargon. It is written for the layman [sic]. It is a scientific book, but it is a sensational book: for it illuminates for us the shape of things to come” (one has to note the use of the word “sensational”, which in current debates about public portrayals of AI is mostly used as a derogatory term, implying distance from technical legitimacy).

Thus, I suggest that the book deserves excavation, being indicative of the mid-1950s promissory environment around thinking machines, prior to the coinage of the term AI, although the English translation overlaps with the year the term was coined (more below).

On July 9th, 2019, almost a year since I received Strehl’s book, I attended a talk at the University of Edinburgh by Toby Walsh, Scientia Professor of Artificial Intelligence at the University of New South Wales. Walsh, whose doctoral degree was obtained in Edinburgh, presented portions of his 2018 book 2062: The World that AI Made, which I acquired and read after the event. In contrast to Strehl’s rather obscure biographical notes, Walsh’s work is well documented on his personal website. In addition to his AI specialisation in constraint programming, Walsh’s work involves policy advising in building trustworthy AI systems as well as lots of public outreach through popular media.

The book, published in English by Black Inc./La Trobe University Press, is of similar magnitude to Strehl’s, given that it has been translated widely: in German (2062: Das Jahr, in Dem Die Künstliche Intelligenz Uns Ebenbürtig Sein Wird – Riva Verlag, Munich), Chinese (2062:人工智慧創造的世界 – 經濟新潮社, Taipei), Turkish (2062: Yapay Zeka Dünyası – Say Yayınları, Ankara), Romanian (2062: Lumea Creata De Inteligenta Artificiala – Editura Rao, Bucharest), and Vietnamese (Năm 2062 – Thời Đại Của Trí Thông Minh Nhân Tạo – NXB Tổng Hợp TP. HCM, Ho Chi Minh City). Taking numbers of translations as an indication of magnitude, I suggest that Walsh’s book can be classified as somewhat comparable to Strehl’s, given that, as mentioned on his website, it is “written for a general audience.” Thus, regardless of the different degrees of AI expertise and the respective contexts of their authors, I suggest these books can be contrasted as end-products indicating AI hype in 1955 and 2018. I hereby aim to recreate my personal experience of discovering the similarities between the two books.

Figure 2: Toby Walsh, Photo Credit CC-BY 4.0. Nadja Meister. (link: https://www.flickr.com/photos/vclatuwien/29968223557/)

I now invite the reader to take a look through the tables of contents displayed at the end of this essay, upon which I will now comment. Strehl’s book contents have been scanned from the original, whereas Walsh’s expanded book contents have been collated in a way that resembles Strehl’s for ease of comparison. (A note on presentation: as the reader will notice, Strehl’s chapters are followed by detailed descriptions of the chapters’ sections, very typical of books from that era. Walsh’s original table of contents includes only the main headings, although within the book sections similar to Strehl’s designate sub-chapters. I have manually copied this sub-heading structure onto the table below in lieu of scannable content.) Notice the similarities in both books’ first chapters, between “the failure of the human brain – the machine begins to think – ‘Homunculus’ is born – the beginning of a new epoch” (Strehl 1955) and “Machines That Learn – Computers Do More Than They are Told – The Machine Advantage – Our Successor” (Walsh 2018).

Both books’ second chapters review the technological advances in machine intelligence of their times: Strehl describes the abilities of early computing and memory-storing machines ENIAC, Mark III, and UNIVAC, as well as the possibility of “automatic weapons.” Meanwhile, Walsh describes recent breakthroughs in game-playing such as Go, although his chapter 0005 is entirely dedicated to “killer robots,” “weapons of terror,” and “of error,” pretty much like Strehl’s penultimate chapter “The Beginning of the Future War,” which contains sections like “Robots become soldiers” and “mechanical brains take over command.” (Interestingly, Walsh does not refer to any cases of factory worker accidents caused by robotic technologies, whereas Strehl mentions two cases of lethal robotic accidents in this chapter’s section “A Robot murders its Master,” similar to newspaper headlines about robotic killers (for example, Huggler 2015, McFarland 2016, or Henley 2022).)

Strehl’s third chapter asks, “Can the Robot Really Think?” in the same way that Walsh asks, “Should We Worry?” Both authors enquire on “The Age-Old Dream of an Artificial Human Being” (Strehl) and “Artificial General Intelligence – How Long Have We Got? – The Technological Singularity” (Walsh); and again, both refer to the question of “free will” in machines (Strehl: “Free will of the Machine?”; Walsh: “Zombie Intelligence […] The Problem of the Free Will”). Strehl dedicates two chapters to job displacement: “The Second Industrial Revolution,” focusing on industrial robotic technologies (“the Robots are in control – machine automatons replace worker battalions – Man [sic] is left out […] the factory without people”), and “The Dictatorship of the Automaton,” mostly focusing on automation technologies conceptually similar to AI (“the automatic secretary […] the telephone Robot listens attentively – Robots keep books conscientiously – Robots sort telegrams […] the Robot as master detective […] the whole of mankind [sic] is filed – Robot salesmen [sic] in the department store […] divorce by ‘automatic’ court decision”). Although today’s equivalents (robot assistants like Alexa or Echo, robotic “judges,” and concerns about data surveillance) are much more technologically advanced, the sentiment captured in Strehl’s book is strikingly similar to several sections in Walsh’s: “The Real Number of Jobs at Risk,” “Jobs Only Partly Automated – Working Less” (on the dangers of job automation), “Machine Bias – The Immoral COMPAS – Algorithmic Discrimination” (on cases of automated decision-making, as in the robotic judge COMPAS), and “AI is Watching You – Linked Data” (on the case of surveillance).

Figure 3: The Robots Are Among Us Table of contents image of the 'Modern Man in the wheels of technique.'

By this point, it has become sufficiently clear that concerns about automation technologies, which in different times can be termed “AI” or “robots” (or differently in regional and research contexts; consider the “I’m not a robot” captcha version of a Turing test), have been sustained at a surprisingly similar degree of comprehension. It should be interesting to note some differences between the two books. First, it is useful to question how the authors gain what we might perceive as their promissory credibility, that is, the right to speculate about a new form of reality which is about to come. As already mentioned, Strehl falls short in terms of references – however, he sets out to clarify that the content presented is realistic: “This book is not about Utopia. It is a factual report of the present time collected from hundreds of sources.” Nevertheless, throughout his book, Strehl refers to warnings about machine intelligence expressed by pioneering minds in the field, often citing cyberneticist Norbert Wiener, but also mathematician Alan Turing, and others. Walsh’s approach is stricter, methodologically speaking, matching contemporary standards:

 

“In January 2017, I asked over 300 of my colleagues, all researchers working in AI, to give their best estimate of the time it will take to overcome the obstacles of AGI. And to put their answers in perspective, I also asked nearly 500 non-experts for their opinion. […] Experts in my survey were significantly more cautious than the non-experts about the challenges of building human-level intelligence. For a 90 per cent probability that computers match humans, the median prediction of the experts was 2112, compared to just 2060 among the non-experts [...] For a 50 per cent probability, the median prediction of the experts was 2062. That’s where the title of this book comes from: the year in which, on average, my colleagues in AI expect humankind to have built machines that are as capable as humans.” (AGI stands for Artificial General Intelligence, that is, the hypothesis that AI might be reaching or surpassing human intelligence, for example Goertzel 2014.)

 

Although the two authors exhibit different strengths in demonstrating their research skills, they both rely on the credibility of external sources to sustain their argument. Moreover, they agree on the possibility of a rather inevitable new form of world which is, in part, already here, and which will invite humanity to think of new forms of living in the near future. Their difference is in their degree of optimism. Strehl agrees with Walsh that machines will always remain in need of human controllers; however, he suggests that machines will take control in a subtler way:

 

“Man [sic] will try to maintain his [sic] supremacy because the machines will always be limited creatures, without imagination and consciousness, incapable of inventiveness outside their own limits. But this supremacy of Man [sic] will only be an illusion, because the machines will have become so indispensable in an unimaginable mechanization of the technical civilization of the future that they will have become the rulers of this world, grown numb through technical perfection. The future mechanized order of society will not be able to continue existing without constant supervision of the thinking machines by their human creators. But the machines will rule.”

 

The following, more optimistic, passage by Walsh can be read as a hypothetical response to Strehl:

 

“But by 2062 machines will likely be superhuman, so it’s hard to imagine any job in which humans will remain superior to machines. This means the only jobs left will be those in which we prefer to have human workers.”

 

Walsh then refers to the emerging hipster culture characterised by appreciation of artisan jobs, craft beer and cheese, organic wine, and handmade pottery.

One should not forget that Walsh’s public outreach on AI stems in part from his lens as an AI researcher. His book is one that admits challenges, but also offers hopeful perspectives. Strehl’s book is written in a rather polemic fashion, while admitting the author’s fascination with the technical advancements; yet it is written by an outsider who has probably not built any robots, at least not to the extent that Walsh has developed algorithms. This difference of balance (small doses of warning followed by hopeful promises in Walsh, as opposed to small doses of excitement followed by dystopian futurism in Strehl) is telling of the expectational environment of AI, which has been evolving at least since the second half of the 20th century, with its roots in the construction of early automata, as well as in mythology, religion, and literature.

Strehl’s book can be classified as indicative of broader circulating narratives which might have influenced decisions within the domain of practice, although it is difficult to find evidence and make robust claims concerning the ways in which such broader public narratives about robots, thinking machines, and electronic brains have influenced the practical direction of research. Walsh’s book can be classified as a product of internal research practices and strategies, aimed at influencing broader narratives (the book’s popularity might be considered as evidence of some sort). The shared themes between the books show that the field (or vision) of intelligent machines (hereby examined as AI) is at the same time broad, yet recognisable and limited in its various instantiations, from automated decision-making to autonomous vehicles.

In this essay, I do not want to make another claim about history repeating itself and the “wow” effect of hype-and-disillusionment cycles: belief in a purely circular history is as reductionist as belief in the modernist notion of linear progress and innovation. This repetition of themes is not a call for the same-old-same-old refrain that AI warnings are of no value because humanity’s previous experience proved so. It is, however, a call to raise awareness about hype, sensitivity about sensationalism, and to treat products of mass consumption about science and technology as artefacts produced by specific and variable social contexts on the micro scale (such as institutional agendas) and by rather generalised and constant psychological patterns on the macro scale: hope and fear. In 1980, Sherry Turkle concluded that the blackbox structure of computers invokes in their users different projected versions of their optimism or pessimism, thus resembling inkblot tests; she thus treated “computers as Rorschach.” Forty-two years after Turkle’s paper, computers and robots have evolved a lot; however, despite numerous calls for explainable AI systems, nothing prevents us from treating “AI as Rorschach” as well as “robots as Rorschach.” This might amount to a creative and therapeutic endeavour in our experience with AI.

Figure 4:  Rolf Strehl, The Robots Are Among Us table of contents.
Figure 5: Toby Walsh, 2062: The World That AI Made, table of contents, incl. sub-sections not available on the printed table of contents page.

Bibliography

Goertzel, Ben. (2014). Artificial general intelligence: concept, state of the art, and future prospects. Journal of Artificial General Intelligence, 5(1), 1-46.

Heinz Nixdorf MuseumsForum (2017). Die Roboter sind unter uns. Blog post. November 7, 2017. Retrieved 18-06-2021 from: https://blog.hnf.de/die-roboter-sind-unter-uns/

Henley, Jon (2022, July 24). Chess robot grabs and breaks finger of seven-year-old opponent. The Guardian. 24-07-2022. https://www.theguardian.com/sport/2022/jul/24/chess-robot-grabs-and-breaks-finger-of-seven-year-old-opponent-moscow

Huggler, Justin. (2015, July 2). Robot Kills Man at Volkswagen Plant in Germany. The Telegraph. Retrieved 3-07-2015 from http://www.telegraph.co.uk/news/worldnews/europe/germany/11712513/Robot-kills-man-at-Volkswagen-plant-in-Germany.html

McFarland, Matt. (2016, July 11). Robot’s Role in Killing Dallas Shooter is a First. CNN Tech. Retrieved 29-04-2017 from http://money.cnn.com/2016/07/08/technology/dallas-robot-death/index.html

Strehl, Rolf. (1952 [1955]). The Robots are Among Us. London and New York: Arco Publishers.

Turkle, Sherry. (1980). Computers as Rorschach: Subjectivity and Social Responsibility. Bo Sundin (ed.). Is the Computer a Tool? Stockholm. Almquist and Wiksell. 81–99.

Walsh, Toby. (2018). 2062: The World that AI Made. Carlton: La Trobe University Press, Black Inc.

Walsh, T. (2021, July 20). Personal website. UNSW Sydney. Accessed 20 July 2021, <http://www.cse.unsw.edu.au/~tw/>

 

Vassilis Galanos (October 2022). “Longitudinal Hype: Terminologies Fade, Promises Stay – An Essay Review on The Robots Are Among Us (1955) and 2062: The World that AI Made (2018).” Interfaces: Essays and Reviews on Computing and Culture Vol. 3, Charles Babbage Institute, University of Minnesota, 73-87.


About the Author: Vassilis Galanos (it/ve/vem) is a Teaching Fellow and Postdoctoral Research Associate at the University of Edinburgh, bridging STS, Sociology, and Engineering departments and has co-founded the local AI Ethics & Society research group. Vassilis’s research, teaching, and publications focus on sociological and historical research on AI, internet, and broader digital computing technologies with further interests including the sociology of expectations, studies of expertise and experience, cybernetics, information science, art, invented religions, continental and oriental philosophy. 


 

 

Photo by Christian Erfurt on Unsplash.

Recently, the term “Quiet Quitting” has gained prominence on social media, driven by employees who are changing their standards about work and by business leaders who are concerned about the implications of this change in attitude and expectations in the workplace. The term initially started trending with posts from employees sharing their perspective. These employees are vocal about changing the standards of achievement and success at work, especially when work and home boundaries are no longer clear.

Quiet Quitting is a call from employees who still value their work but who also want to feel valued and trusted in return. This is a call from those whose work and personal lives are not balanced and who are looking for a healthier way to set boundaries. It is a reaction to the changes caused by the pandemic, which allowed some employees to work from home but which also further blurred the lines between work and home space. It is about corporations finding a multitude of ways to ensure their employees are connected to work around the clock, and about workers not wanting to be available to their employers for time for which they are not compensated, or work for which they are not recognized. It should not be a reason to criticize, shame, scare or surveil employees.

The pandemic caught many organizations unprepared for a sudden shift to remote work arrangements. Employers who were worried about the performance levels of their now-remote workers implemented several measures – some more privacy-invading than others. Unfortunately, for many companies, the knee-jerk reaction was to implement employee monitoring (or surveillance) software, sometimes referred to as ‘bossware’. Vendors selling this software tend to pitch their products as capable of achieving one or more of the following: “increase in productivity/performance; prevention of corporate data loss; prevention of insider threat; effective remote worker management; data-based decision-making on user behavior analytics; sentiment analysis to identify flight-risk employees.” The underlying assumptions of this set of functions are:

  • employees cannot be trusted and left to do what they are hired to do;
  • human complexity can be reduced to some data categories; and
  • a one-size-fits-all definition of productivity exists, and the vendor’s definition is the correct one.

In response to employees who suggest they will only do what they are hired to do and not more until the expectations are changed, AI-based employee surveillance systems are now being discussed as a possible solution to ‘Quiet Quitting’. Employee surveillance was never a solution for creating equitable work conditions or increasing performance in a way which respected the needs of employees. It certainly cannot be a solution to the demands of workers trying to stay physically and mentally healthy.

Photo by Chris Yang on Unsplash. ​​​​

The timing of tasks and the monitoring of employee activity in assembly lines and warehouses go back to the time of Frederick Winslow Taylor. Taylorism aimed to increase efficiency and production and eliminate waste. It was also based on the “assumptions that workers are inherently lazy, uneducated, and are only motivated by money.” Taylor’s approach and practice have been brought to their contemporary height by Amazon, with its minute-by-minute tracking of employee activity and termination decisions made by algorithmic systems. Amazon uses such tools as navigation software, item scanners, wristbands, thermal cameras, security cameras and recorded footage to surveil its workforce in warehouses, delivery trucks and stores. Over the last few years, employee surveillance practices have been spreading into white- and pink-collar work too.

According to a recent report by The New York Times, eight of the ten largest private U.S. employers track the productivity metrics of individual workers, many in real time. The same report details how employees described being tracked as “demoralizing,” “humiliating” and “toxic,” and that 41% of employees report that nobody in their organization communicates with them about what data is collected and why, or how it is being used. Another 2022 report by Gartner shows the number of large employers using tools to track their workers has doubled since the beginning of the pandemic to 60%, with this number expected to rise to 70% within the next three years.

Photo by Carl Heyerdahl on Unsplash.

Employee surveillance software is extensive in its ability to capture privacy-invading data and make spurious inferences regarding worker performance. The technology can log keystrokes or mouse movements; analyze calendar activity of employees; screen emails, chat messages or social media for both activity intervals and content; take screenshots of the monitor at random intervals; analyze which websites the employee has visited and for how long; force activation of webcams; and monitor the terms searched by the employee. As an article in The Guardian on AI-based employee surveillance tools explains, the concerns regarding the use of these products range from the obvious privacy invasion in one’s home to reducing workers, their performance, and their bodies to lines of code and flows of data which are scrutinized and manipulated. Systems which automatically classify a worker’s time into “idle” and “productive” reflect the value judgments of their developers about what is and is not productive. An employee spending time at a colleague’s desk explaining work or mentoring them for better productivity can be labeled by the system as “idle”.
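To make that point concrete, the deliberately naive sketch below shows how an activity classifier bakes in its developers’ value judgment that only keyboard and mouse input counts as work, so off-keyboard mentoring is counted as idle time. It is not any vendor’s actual algorithm; the data structure, thresholds, and example events are invented for illustration.

```python
"""Deliberately naive sketch of 'bossware'-style activity classification.
Not any vendor's real algorithm; events and labels are invented for illustration."""
from dataclasses import dataclass


@dataclass
class ActivitySample:
    minute: int          # minute of the working day
    keystrokes: int      # input events captured in that minute
    mouse_events: int


def classify(samples):
    """Label each minute 'productive' if any input was captured, else 'idle'.
    The value judgment is baked in: off-keyboard work such as mentoring a
    colleague produces no events and is therefore counted as idle."""
    return ["productive" if s.keystrokes + s.mouse_events > 0 else "idle"
            for s in samples]


# Ten minutes mentoring a colleague (no input), then ten minutes of typing:
day = [ActivitySample(m, 0, 0) for m in range(10)] + \
      [ActivitySample(m, 40, 5) for m in range(10, 20)]
labels = classify(day)
print(f"{labels.count('idle')} of {len(labels)} minutes labeled idle")  # -> 10 of 20
```

Half of a genuinely productive morning is scored as idle here, which is exactly the kind of spurious inference the paragraph above describes.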

Even though natural language processing is not capable of understanding the context, nuance or intent of language, AI tools which analyze the content and tone of one’s emails, chat messages or even social media posts ‘predict’ if a worker is a risk to the company. Forcing employees who work from home to keep their cameras on at all times can lead to private and protected information of the employee being disclosed to the employer. Furthermore, these systems remove basic autonomy and dignity at the workplace. They force employees to compete rather than cooperate, and to think of ways to game the system rather than thinking of more efficient and innovative ways to do their work. A CDT report focuses on how bossware can harm workers’ health and safety by discouraging and even penalizing lawful, health-enhancing employee conduct; by enforcing a faster work pace and reduced downtime, which increases the risk of physical injuries; and by increasing the risk of psychological harm and mental health problems for workers.

Just as employee surveillance cannot replace trusting and transparent workplace relationships, it cannot be a solution to Quiet Quitting. Companies implementing such systems do not understand the fundamental reasons for this call. The reasons are not universal, and there is no single solution for employers. The responses may range from fairer compensation to better communication practices, investment in employees’ skills, and setting boundaries between work and personal life. Employers need to create space for open communication and understand the underlying reasons for frustration and the call for change. Employers need to ‘hear’ what their employees are telling them, not surveil them.

==============================

Disclosure: The author also provides capacity building training and consulting to organizations for AI system procurement due diligence, responsible design, and governance. Merve Hickok is a certified Human Resources (HR) professional with 20 years of experience, an AI ethicist and AI policy researcher. She has written extensively about different sources of bias in recruitment algorithms, their impact on employers and vendors, and AI governance methods; provided public comments for regulations in different jurisdictions (New York City Law 144, California Civil Rights Council, White House Office of Science and Technology RFI); co-crafted policy statements (European Commission) and contributed to the drafting of audit criteria for the audit of AI systems (ForHumanity); has been invited to talk at a number of conferences, webinars and podcasts on AI and recruitment, HR technologies and their impact on candidates, employers, businesses and the future of work; and was interviewed by both HR professional organizations (SHRM Newsletter, SHRM opinion pieces) and by newspapers (The Guardian) about her concerns and recommendations.


Bibliography

Bose, Nandita (2020). "Amazon's surveillance can boost output and possibly limit unions – study." Reuters, August 31.

Corbyn, Zoe (2022). “‘Bossware is coming for almost every worker’: the software you might not realize is watching you.” The Guardian, April 27.

Kantor, J., Sundaram, A., Aufrichtig, A., and Taylor, R. (2022). "The Rise of the Worker Productivity Score." New York Times, August 14.

Scherer, Matt and Brown, Lydia X. Z. (2021). "Report – Warning: Bossware May Be Hazardous to Your Health." CDT, July 24.

Turner, Jordan (2022). "The Right Way to Monitor Your Employee Productivity." Gartner, June 09.

Williams, Annabelle (2021). "5 ways Amazon monitors its employees, from AI cameras to hiring a spy agency." Business Insider, April 5.

Wikipedia. Digital Taylorism. https://en.wikipedia.org/wiki/Digital_Taylorism.

 

Merve Hickok (September 2022). “AI Surveillance is Not a Solution for Quiet Quitting.” Interfaces: Essays and Reviews on Computing and Culture Vol. 3, Charles Babbage Institute, University of Minnesota, 65-72.


About the Author: Merve Hickok is the founder of AIethicist.org. She is a social researcher, trainer, and consultant whose work intersects AI ethics, policy, and regulation. She focuses on AI bias and the impact of AI systems on fundamental rights, democratic values, and social justice. She provides consultancy and training services to private and public organizations on Responsible AI to create awareness, build capacity, and advocate for the ethical and responsible development, use, and governance of AI. Merve is also a Data Ethics Lecturer at the University of Michigan and Research Director at the Center for AI & Digital Policy. The Center educates AI policy practitioners and advocates across roughly 50 countries and leads a research group which advises international organizations (such as the European Commission, UNESCO, and the Council of Europe) on AI policy and regulatory developments. Merve also works with several non-profit organizations globally to advance both academic and professional research in this field for underrepresented groups. She has been recognized by a number of organizations, most recently as one of the 100 Brilliant Women in AI Ethics™ (2021) and as Runner-up for Responsible AI Leader of the Year (Women in AI, 2022). Merve was previously a VP at Bank of America Merrill Lynch and held various senior Human Resources roles. She is a Senior Certified Professional with SHRM (the Society for Human Resource Management).


 

 

Cover of Ebony magazine featuring African American boy and his computer circa 1969.
Figure 1: Robert Dodson, who was featured as the "Computer Whiz Kid" in the December 1969 issue of Ebony Magazine.

 

The Computer Whiz Kid

The ability to comment, like, and share stories is a powerful feature of our digital world. Small stories online can gain big attention through the conversations they inspire. However, in the 1970s, long before influencers and their platforms of choice, the internet, and publicly used digital networking systems, Anita Taylor wrote a letter. Her "letter to the editor," a short note of less than 100 words, was published in the March 1970 issue of Ebony Magazine:

 

I have read the article Computer Whiz Kid (December 1969) numerous times. It is so encouraging to hear of a black youth achieving such high goals, especially for those of us who live in the Deep South. It's this sort of inspiration that is needed to give us hope and faith.

I am a Junior in high school. At present, I'm enrolled in a chemistry course and physics will probably appear on my schedule for the next session.

--Anita Taylor  

 

Anita Taylor, much like a high schooler might today, was commenting on a story that inspired her. "The Computer Whiz Kid," a "lanky Chicago teenager" named Robert Dodson, motivated Anita to act by showing her what was possible. Her response to the article indicates how literature served as a method for connecting readers and symbolic individuals as members of a single community during the mid-20th century. Dodson's story, with its crisp photographs and clear messages of black success, progress, and creative technical ability, entered the homes of thousands of black families across the nation. In Ebony, Robert Dodson as "The Computer Whiz Kid" was a symbol providing "hope" and "faith" to an ever-growing audience (101-104).

Robert Dodson is also an example of how the work of different communities of knowledge and action came together in a way that shaped an individual. Dodson's presence on the campus, and in a mixed dorm, at the Illinois Institute of Technology represents a point in history where the work of civil rights activists and black betterment organizations was pushing up against the troubled history of segregation and unequal opportunity in Chicago. In 1969 he was a freshman at the Illinois Institute of Technology. Thirty years before his story, the Institute launched architectural expansion and urban renewal projects that resulted in the removal of land wealth from segregated black communities who had built for themselves a city within the city. Growing up in the shadow of the Institute, Dodson's hobby of choice, building and programming his computers, provided a retreat from the gang activity inundating his neighborhood while connecting him to the products of knowledge communities far away. Cambridge University Press published the book he used to build his computer, We Built Our Own Computers, in 1966. The book was designed "to explain some of the ideas and uses of computers to the intelligent and interested schoolboy who, it is hoped, will then be stimulated to design and build his own computing machines" (Bolt and Harcourt, xi). To make his machine, Dodson enlisted the help of family, and he also used a brief internship at the North American Company for Life and Health Insurance to play with bigger and stronger versions of what he was making at home on the dining room table.

 

The cover of the Cambridge University Press book We Built Our Own Computers from 1966.
Figure 2: Cover of the book We Built Our Own Computers, published by Cambridge University Press in 1966.

Dodson's Computer, a Group Project

Dodson's success at transforming a hobby (building computers) into a potential career was the product of different communities who unintentionally collaborated to make a future in computing possible for him. The social-work-focused and volunteer-powered Chicago Urban League contributed to the integration of the Illinois Institute of Technology and possibly connected Dodson to education and employment opportunities. At Cambridge University, educators wanted to share computing knowledge with American youth, so they wrote and published educational books to do just that. Some other communities that contributed to Dodson's success include the teachers and librarians who encouraged Dodson; the vendors of electrical parts and pieces, who supplied the bits needed to build a computer; and the college admission personnel who interviewed and then admitted Dodson. Additionally, the finance and housing institutions that made higher education economically possible for Dodson, and the healthcare workers who supported his fitness and readiness for dorm life, were all communities contributing to his success.

With so many "communities" influencing the opportunities possible for an individual, it can be challenging to discern what makes a community, which communities matter for a history of computing, and the relationship between the community and the individual. However, as a technology that constructed the symbolic narrative of Dodson and connected that narrative to a black audience, publications like Ebony Magazine were technologies that held African-American computing communities together. Dodson, through Ebony, could influence readers like Anita Taylor to act on their dreams and work towards their "high goals." Likewise, through the story of Dodson, a special kind of black individual was constructed. This individual, not always a "Computer Whiz Kid," was to be emulated by the reader, reproduced in black society, and shared by the media repeatedly. Looking at media made for and by black people, African American computing communities consisted of the audience for "The Computer Whiz Kid" and people like Robert Dodson, who allowed their stories to be shared. African American computing communities also include the organizations that decided that the black readers of Ebony in 1969 needed to meet "The Whiz Kid."

Civil rights and black betterment organizations fighting for equality and freedom sought to create symbols of black defiance, hope, future, and success. An example of their work is Rosa Parks, who became a symbolic individual representing defiance and resistance to the system of inequality that bolstered segregation, and not by chance. In 1955, civil rights organizers from the NAACP waited to find the right person to build a public legal case around. Rosa Parks was not the first person of African descent to be arrested in Montgomery, Alabama, for disobeying segregation bus laws. She was, however, the one that civil rights organizers identified as being best suited for the spotlight (Tufekci 62-63).

Figure 3: The short film A Morning for Jimmy, produced by the National Urban League and Association Films, Inc., tells the story of an African American youth named Jimmy who encounters racial discrimination in his search for employment.

African American computing communities consisted of an audience hungry for a better future, civil rights and black betterment organizations fighting to make opportunity possible, and the black press that deliberately connected audience and organizers to improve the status of black people in America. The elements that comprise African American computing communities (audience, media, symbolic individuals, and civil rights organizing) are also characteristic of the history of other black communities. The literature on black labor, media, activism, class, and culture of the 19th and 20th centuries proposes that the large collective "African-American community" was formed out of smaller communities (in fields of work, in neighborhoods, on HBCU and college campuses). These smaller communities networked for full citizenship, creating cultural products (literature, language, attitudes) that organized black people nationally into a people with a distinct voice in American history. Shut out of mainstream society by racist classifications as other than American, human, and citizen, black people responded to this willful stifling of their futures, rooted in histories and legacies of inequality, by becoming a demographic with unique language, culture, and politics (Foner and Lewis 511).

When black people were not allowed to live as full citizens, organizing for the "betterment of the Negro race" became the mission of societies that made minority betterment the ultimate point of organizing. Professional and social organizations like the Brotherhood of Sleeping Car Porters, the NAACP, The National Technical Association, Alpha Phi Alpha, Kappa Alpha Psi, Alpha Kappa Alpha, and Delta Sigma Theta have shaped the image of the African-American as one who is not peripheral to the project of America. By doing so, their missions are entwined in the history of technology in America. Connected by the goal of bringing black people into the "project of America," a project shaped by innovation and a distinct spirit of rugged individuality and materiality, they sought to democratize the labor and culture of technology. No longer would the machinery that powers America be inappropriate and inaccessible for black people because of race: the future these communities fought for was one where black people could be both black and American, black and skilled, black and professional, black and technical, and black and middle-class.

The Black Press       

African American communities of computing, like other groups in the history of computing, are made of writers, doers, and readers. Not just the remarkable men and women who fought to succeed, but the communities they belonged to and the conversations and messages they were a part of. All members of black computing communities were connected by automatic second-class status, where they were locked out, misrepresented, and stereotyped in the mainstream press. In tune with the needs of its audience, black print media was the most influential information medium for black people. This media amplified the voice of the people while explaining what the world of war, of technology, of business was and what it could mean to them. The black press, known as the "fighting press," utilized information technology to connect members in different communities for common goals or shared interests (O’Kelly 13).

In general, magazines, newspapers, and other print media are forms of public discourse that allow readers to engage with ideas, both old and new. Print media disseminates ideas by using the language and values that matter to the audience of the magazine. Language and values can be common sense beliefs regarding fairness, citizenship, and usefulness. Print media uses "frames" or the principles that organize information by referencing the social structures and realities, real or imagined, that matter to an individual or an audience. In this history, the frames used by the black press were ones that focused on the reality of black life in America: segregation and second-class status. Magazines organize information into frames so that the content is not disconnected from the social understanding of readers. This organization helps readers make sense of the new, by grounding the unknown in the familiar and "correct." When the content of magazines is computer technology, "common sense" values and power dynamics are embedded in how these new technologies are contextualized for audiences. Black newspapers framed the computer as a tool for black freedom by focusing on skill, education, professionalization, class, and materiality - issues that were already in the minds of the black public.

Looking away from black media, toward what could be called "mainstream media," the result of frames for technological diffusion is stories of computers that show them to be hosts for useful activities and social evolutions. A quick historical narrative of this framing, from the 1950s to the present day, found in "mainstream" magazines shows that what computers are and can do changes as their technical capabilities develop and audiences become more familiar. The frames used to describe computers found in business magazines in the 1950s generally describe them as calculators useful for processing numerical data. Eventually, computers become more than just calculators but a way to improve speed and efficiency, a tool for management. They are giant, powerful brains that threaten to replace workers in an expanding range of fields. To a different audience, and with a more advanced computer, they are not just computers but hobbies and toys. As computers become "personal" in the 1980s, they are not only computers but extensions of individuality, independence, and creativity (Cogan 248-265). By the end of the 20th century, the ability to set up networks through personal computers makes them not only computers but communication devices that are part of a global network of information sharing. As computers travel and find homes in communities of color in the U.S. and globally, they become more than just computers but tools for development and participation in the global information economy.

The frames that referenced the values, fears, truths, and realities of African Americans in the twentieth century were notably different from those of their white counterparts. Likewise, print media tells us how computers were incorporated into African American life during this time and why they were framed differently than in the mainstream publications usually studied. This is not to say that black people did not read magazines like Time or Newsweek; however, the frames used by mainstream publications were not concerned with the black perspective, thus creating the need for a black press.

Black newspapers shared the good news of opportunity while not ignoring the harsh realities of America's racialized labor economy; they also offered "what if" scenarios. In the New Pittsburgh Courier of September 6, 1969, in a letter to the editor, Jesse Woodson Jr. proposed a solution for the criminal justice problem:

 

Dear Editor:
In view of the present inequalities which exist in the detection, prosecution, and confinement of black criminals vs. white, I think the black man would receive a great deal more fairness and impartiality from an IBM computer.

First, identify and catalog the various crimes. Next, edit the trials of the various city, state, and government courts over the past 25-years. Include all facts concerning investigations acquittals and convictions, plus the day dialogue of both the defense and the trial cancels. The computer when programmed with the above information would then be capable of rendering a decision based on the aggregate experience of the nation's law interpreters.

This decision would not take into consideration race or background. However, as likely as not, so some Mississippi court will discover the need for one computer for white and another for blacks.

--Jesse Woodson JR

 

What if a computer could solve the problem of racism in America? What if "race and background" were irrelevant in the new technical order? We now know that computers are not impartial arbiters, and people have yet to successfully exclude race and class from computerized decision-making systems. In 1969, however, Jesse Woodson and those who read his suggestion were mentally experimenting with the computer as a tool for freedom from prejudice and from the mainstream's conflation of black and criminal. Even this ideal "what if" came with skepticism, as Woodson notes that the racist system in which "fair" computerized decision-making systems operate could corrupt them.
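The reason Woodson's proposal could not deliver impartiality can be illustrated with a small, entirely synthetic sketch: even when race is withheld from the model, a rule trained on the "aggregate experience" of a biased court system learns the bias back through correlated proxies such as neighborhood. The data and decision rule below are invented for illustration only.

import random

# Hypothetical illustration only: a decision rule trained on the "aggregate
# experience" of a biased court system. Race is never given to the model,
# but neighborhood acts as a proxy for it in the historical record.
random.seed(0)

def historical_record(n=10_000):
    cases = []
    for _ in range(n):
        neighborhood = random.choice(["north", "south"])
        # Assumed, synthetic bias: identical conduct, unequal outcomes.
        conviction_rate = 0.7 if neighborhood == "south" else 0.3
        cases.append((neighborhood, random.random() < conviction_rate))
    return cases

record = historical_record()

# "Training": learn the historical conviction rate per neighborhood, nothing else.
rates = {}
for hood in ("north", "south"):
    outcomes = [convicted for h, convicted in record if h == hood]
    rates[hood] = sum(outcomes) / len(outcomes)

def predict(neighborhood):
    """Predict conviction from the learned historical rates alone."""
    return rates[neighborhood] > 0.5

print(rates)                               # roughly {'north': 0.3, 'south': 0.7}
print(predict("north"), predict("south"))  # False True: the bias is reproduced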

Conclusion

From the 1940s to the 1980s, emissaries like "The Whiz Kid" ventured into the slowly integrating universities and offices of the information age. They ventured out, but they also reported back. Through black media, they communicated what computing meant for black people, and, as skilled workers in the new computer age, they embodied the characteristics of success. Through the technologies of storytelling, their image and traits became ingrained in community memory as necessary for the future. Because of them and the machines they controlled, new symbolic identities were formed, dismissed, and made immovable, stretching what held a community together across lives and worlds unique to the imaginations of its members.


Bibliography

Aspray, William and Donald Beaver. (1986). "Marketing the Monster: Advertising Computer Technology," Annals of the History of Computing, vol. 8, no. 2, pp. 127-143. doi: 10.1109/MAHC.1986.10038.

Bolt, A. B., Harcourt, J. C., Hunter, J. (1966). We Built Our Own Computers. Cambridge: Cambridge University Press.

Boyd, M. (2008). Jim Crow Nostalgia: Reconstructing Race in Bronzeville. Minneapolis: University of Minnesota Press.

Brown, Tamara, Gregory Parks, and Clarenda Phillips. (2012). African-American Fraternities and Sororities: the Legacy and the Vision, 2nd ed, Lexington: University Press of Kentucky.

Cogan, Brian. (2005). "Framing usefulness: An examination of journalistic coverage of the personal computer from 1982–1984," Southern Journal of Communication, vol. 70, no. 3, pp. 248-265. doi: 10.1080/10417940509373330.

Foner, Philip and Ronald Lewis. (1983/2019). "The Black Worker from the Founding of the CIO to the AFL-CIO Merger, 1936-1955." Philadelphia: Temple University Press, pp. 511.

Gibbons, Kelcey. (2022). Inventing the Black Computer Professional. In J. Abbate and S. Dick (Eds.), Abstractions and Embodiments: New Histories of Computing and Society (pp. 257-276). Johns Hopkins University Press.

McDonough, John and Karen Egolf. (2003). "Computers," in The Advertising Age Encyclopedia of Advertising, New York: Routledge.

O'Kelly, Charlotte. (Spring 1982). "Black Newspapers and the Black Protest Movement: Their Historical Relationship, 1827-1945." Phylon, vol. 43, no. 1, pp. 13.

Taylor, Anita. (March 1970). "Computer Whiz Kid," Ebony, pp. 17.

Taylor, Anita. (December 1969). "Computer Whiz Kid," Ebony, pp. 101-104.

Tierney, Sherry. (2008). "Rezoning Chicago's Modernisms: 1914–2003" (Master's thesis, Arizona State University), 6-99.

Tufekci, Zeynep. (2017). Twitter and Tear Gas: The Power and Fragility of Networked Protest. New Haven: Yale University Press.

Woodson Jr., Jesse. (September 1969). "Job for Computer," New Pittsburgh Courier, 14.

 

Kelcey Gibbons (August 2022). “Framing the Computer.” Interfaces: Essays and Reviews on Computing and Culture Vol. 3, Charles Babbage Institute, University of Minnesota, 54-64.


About the Author: Kelcey Gibbons is a PhD student at MIT in the History, Anthropology, and Science, Technology, and Society (HASTS) program. She studies the history of the African American experience of technology, with a focus on engineering and computing communities from the late 19th through the mid-20th centuries.


 

 

With its tendency to grip popular imaginaries and utopian fantasies, artificial intelligence has crystallized the enduring hope for easy technological solutions to the world's greatest problems and fears (Haigh & Ceruzzi, 2021; Plotnick, 2018). It has been hailed as "the magic wand destined to rescue the global capitalist system from its dramatic failures" (Brevini, 2022, p. 28), and has been positioned as the linchpin of modern civil society. But, while developments in artificial intelligence technologies are commonly considered among the most important factors in shaping the modern condition, they also have exacerbated inequality, ushered in a new era of discrimination (D'Ignazio & Klein, 2020; Benjamin, 2018; Radin, 2017), left irreversible environmental damage (Brevini, 2022; Dauvergne, 2021), worsened labour struggles (Frey, 2021; Pasquale, 2020; Gray & Suri, 2019), and concentrated power – and wealth – in the hands of the privileged elite (Brevini, 2022; Crawford, 2020; Frey, 2019). As such, critically studying artificial intelligence requires a multifaceted understanding of it as being both controllable and controlling, dependent and autonomous, minimal and robust, submissive and authoritative, and determined and determinable.

To fully understand these binaries and their implications, artificial intelligence research undertaken in the humanities and social sciences warrants a long-term, historical approach that views artificial intelligence in the broader context of technological development, including the social, political, environmental, and cultural forces impacting it. This is especially the case given that the so-called “artificial intelligence boom” in academia has led to a bias towards works published in the last couple of years. But, if properly informed by the past, artificial intelligence research is more likely to prepare users for the future while also shedding light on the ways that we must act differently in the face of technological change.

Cloenda Summit Digital Future Society: Mind Map
Figure 1: Cloenda Summit Digital Future Society: Mind Map.
Courtesy of Barcelona.cat, CC BY-NC-ND 2.0.
 

Technology development and usage carry the imprint of political, ontological, and epistemological ideologies, such that every modern technology, including and especially artificial intelligence, is an infinitesimal representative of not just what users know, but how users come to know. Insofar as the humanities and social sciences are interested in technology as an instigator of cultural change, these disciplines must centralize its historical and epistemological dimensions, and investigate how, at every major historical moment in the development of modern technology and artificial intelligence/computational systems, users have adapted to new forms of knowledge-making.

Although most research in humanities and social sciences exhibits some kind of historical immediacy, it tends to be detached from larger epistemological considerations that align with major historical moments of change. Understanding, at each major technological juncture, how technology users come to know, may be crucial to developing better knowledge about technology (including artificial intelligence), its users, and the world.

This research would involve a multifaceted, interdisciplinary methodology that is both "anti-modern" and philosophical. Edwards (2003), for example, suggests that any historical and archival approach to technological inquiry necessarily avoids falling into the trap of "technological determinism" that plagues so much current artificial intelligence research, especially research conducted through short-term analyses. Selective attention primarily to the "modern" aspects of infrastructures can produce blindness to other aspects that may, in fact, be "anti-modern"; as Golumbia (2009) contends, irrespective of "new" technologies, human societies remain bound by the same fundamental forces that they always have been, so technological changes are best seen as shifts in degree rather than in kind. For this reason, technology ought to be assessed with reference to the past, especially because the computer apparatus leaves "intact many older technologies, particularly technologies of power, yet puts them into a new perspective" (Bolter, 1984, p. 8).

US Navy Cryptanalytic Bombe
Figure 2: US Navy Cryptanalytic Bombe.
Courtesy of brewbooks, CC BY-SA 2.0.

This approach to artificial intelligence research would model a different kind of temporal orientation for the humanities and social sciences that is rooted in the recognition that both ethereal, “cloud-like” technologies and “resolutely material media” (Mattern, 2017) have always co-existed. Because the old and the new necessarily overlap, it is important to draw archival linkages to produce more precise and comprehensive evaluations of technology and technological change. As Chun (2011) notes, new media races simultaneously towards the future and the past in that “the digital has proliferated, not erased, [historical] media types” (p. 11, 139).

An historical way forward may also be key to confronting and dismantling algorithmic coloniality, the idea that colonial logics are replicated in computational systems, including in how sovereignty and land exploitation are embedded in the digital territory of the information age (Mohamed, Isaac, & Png, cited in Acemoglu, 2021; Lewis et al., 2020; Radin, 2017). Algorithmic coloniality suggests that the dominance and manipulative power of the world’s largest technology corporations mirrors traditional strategies of imperial colonizers (Brevini, 2022, p. 95). While the benefits of technological innovation accelerate economic gains for the privileged elite, Mohamed, Isaac, and Png (cited in Acemoglu, 2021) argue that any pathway to shared prosperity must address colonial legacies and three distinct forms of algorithmic harms: algorithmic oppression, exploitation, and dispossession (p. 61). Doing so is not only consequential for people who identify as being Indigenous; it may provide the tools necessary for intervening in the perpetuation of discrimination, generally (Radin, 2017). This, Lewis et al. (2020) claim, forms a powerful foundation to support Indigenous futurity (p. 65) while injecting artificial intelligence development with new ontologies whose imaginations and frameworks are better suited to sustainable computational futures (p. 6).

Extending from this, an historical approach may also be key to recognizing “non-Western,” alternative ways of knowing and being, including how “non-Western” technology may influence future iterations of artificial intelligence technologies. This is made clear in the Indigenous Protocol and Artificial Intelligence Working Group’s explanation of the potential links between artificial intelligence technologies and both the Octopus Bag – a multisensorial computing device – and Dentalium – tusk-like shells filled with “computational fluid dynamics simulations” (Lewis et al., 2020, pp. 58-69). This approach, however, may present methodological challenges as researchers try to embrace the nourishing aspects of our traditional value systems while still accommodating modernity.

An historical approach may also serve environmental considerations well, especially in the context of the humanities and social sciences. Adequate research on renewability, ecofuturisms, and the environmental costs of artificial intelligence should span the entire production chain, including the historical circumstances in which those “productive” relationships arose. This view is critically important to exposing the environmental effects of technology, while recognizing that both ecological and social precarity caused by technology is not just a timely and urgent idea, but also one with a rich history. Too much recent and short-term research looks at the ecological impacts of artificial intelligence as a “new” phenomenon, rather than one that replicates historical trends albeit through modern consumption rates (which make environmental effects seem historically unique). Informed by the past, environmental research about technology is more likely to prepare users for the future while also shedding light on the ways that we may want and need to act differently in the face of technological change.

An historical approach to studying artificial intelligence may also help us to: 1) re-evaluate the consumptive ideologies underpinning environmental AI discourse; 2) begin to view data as a non-renewable resource; 3) construct a new genealogy of contemporary technological culture that centers bodily subjects; and, 4) perhaps even consider acting against technological progressivism by halting the production of new “innovations” that “datafy” manual or semi-manual sectors and technologies, merely for the sake of it.

These suggestions would challenge the dominance of artificial intelligence technologies, provide different ways to imagine technological innovation and its cultural implications, and re-envision a world that may not rely on technology to solve the most pressing social, environmental, and political questions. These perspectives could also drastically change our view of the relationship between people, energy, and information. Although these considerations may seem radical and aspirational, they are necessary if we want to reorient perspectives in artificial intelligence research and think about the agents – both human and non-human – that are served and impacted by today’s dominant visions for the future of technology.

 

Octopus Bag
Figure 3: Octopus Bag 
Courtesy of https://spectrum.library.concordia.ca/id/eprint/986506/7/Indigenous_Protocol_and_AI_2020.pdf .
 

Utopian and idealistic views of artificial intelligence are justified by a host of corporate, governmental, and civil actors, who have four major reasons for supporting the continued use and development of artificial intelligence:

  1. Leveraging computational speed to make work more efficient;
  2. Appearing to improve the perceived accuracy, fairness, or consistency of decision-making (in light of so-called “human fallibility”);
  3. Similarly, appearing to depoliticize decision-making by placing it out of reach of human discretion; and,
  4. Deploying artificial intelligence technologies to solve pressing environmental issues.

These motivations, especially when devoid of historical consideration, have led to an automation bias whereby humans tend to trust computational tools more than they probably should. This raises distinct concerns about oversight and responsibility and about the ability to seek recourse in the wake of computational error. In other words, any motivation to use and deploy artificial intelligence technologies necessarily presses up against regulatory, legal, and ethical questions because, at its core, artificial intelligence can distort people's perception of each other and the structures and systems that govern their lives. This is especially true when such technology is viewed as being inherently modern, rather than merely part of a longer, historical lineage of technological advancement.

Dentalium
Figure 4: Dentalium
Courtesy of https://spectrum.library.concordia.ca/id/eprint/986506/7/Indigenous_Protocol_and_AI_2020.pdf

In this sense, studying artificial intelligence with an historical orientation is as much about people, culture, and the world, as it is about the technology itself. Artificial intelligence is people-populated. It is reliant on human bodies and brains. It is dependent on human hands and eyes. It is fueled by us. But technochauvinism and techno-optimism (both inherently modernist ideologies) hinder our ability to see this. Instead, artificial intelligence perpetuates the fantasy of ever-flowing, uninterrupted, and curated streams of information, technological solutionism, and optimism about artificial intelligence's ability to solve the world's most pressing questions – as long as it's designed with "humans in the loop." This framing, though, limits and constrains human agency and autonomy by positioning humans as a mere appendage to the machine. This view relies only on small tweaks to the current automated present and fails to account for artificial intelligence imaginaries informed by the past that may better address the harms and inequities perpetuated by current artificial intelligence systems.

A strictly modernist approach to artificial intelligence and automation in general has hampered people’s ability to imagine alternatives to artificial intelligence systems, despite overwhelming evidence that the integration of those systems into our everyday lives disproportionately benefits the wealthy elite and creates undue harm to vulnerable groups (Acemoglu, 2021; D’Ignazio & Klein, 2020; Benjamin, 2018; Radin, 2017). This is because, without an historical orientation, it is natural – and easy – to view artificial intelligence as not only representative of the future, but also as actively shaping it by both opening and closing imaginative possibilities of what the world can become with the “help” of new technologies.

Instead, I’d like to draw attention to an alternative vision: what if we resist the urge to build, deploy, and use new computational systems? What if we begin to realize that technology might not be our world’s saviour? What if we choose to slow down and work intentionally and mindfully instead of quickly? These questions are not meant to elide the important computational work currently carried out by and through artificial intelligence systems, including and especially in medical applications and in services that are too dangerous for human actors to perform. Instead, this alternative vision for the future, which is deeply rooted in historicity, simply resists viewing technology as determined, and instead sees it as being determinable. It reorients power in the favour of human agents rather than technological ones.

Perhaps the "AI question" can only be solved when people are empowered to imagine futures beyond the dominance of techno-utopianism. After all, new imaginaries are most dangerous to those who profit from the way things currently are. Alternative futurisms have the power to show that the status quo is fleeting, non-universal, and unnecessary, and although artificial intelligence has changed the world, people have the ultimate power to shape it.


Bibliography

Acemoglu, D. (2021). Redesigning AI: Work, democracy, and justice in the age of automation. Massachusetts: MIT Press.

Benjamin, R. (2019). Race after technology. Cambridge: Polity Press.

Bolter, J. (1984). Turing’s man: Western culture in the computer age. University of North Carolina Press.

Brevini, B. (2021). Is AI good for the planet? Cambridge: Polity Press.

Broussard, M. (2018). Artificial unintelligence: How computers misunderstand the world. Massachusetts: MIT Press.

Chun, W. (2011). Programmed visions: Software and memory. Massachusetts: MIT Press.

Crawford, K. (2021). Atlas of AI: Power, politics, and the planetary costs of artificial intelligence. New Haven: Yale University Press.

Dauvergne, P. (2021). AI in the wild: Sustainability in the age of artificial intelligence. Massachusetts: MIT Press.

D’Ignazio, C., and Klein, L.F. (2020). Data feminism. Massachusetts: MIT Press.

Edwards, P.N. (2003). Infrastructure and modernity: Force, time, and social organization in the history of sociotechnical systems. In Modernity and Technology (eds. Misa, T.J., Brey, P., and Feenberg, A.). Massachusetts: MIT Press.

Frey, C.B. (2021). The technology trap: Capital, labor, and power in the age of automation. New Jersey: Princeton University Press.

Golumbia, D. (2009). The cultural logic of computation. Massachusetts: Harvard University Press.

Gray, M., and Suri, S. (2019). Ghost work: How to stop Silicon Valley from building a new global underclass. Mariner Books.

Haigh, T., and Ceruzzi, P.E. (2021). A new history of modern computing. Massachusetts: MIT Press.

Lewis, J. et al. (2020). Indigenous Protocol and Artificial Intelligence: Position Paper. Indigenous Protocol and Artificial Intelligence Working Group. https://www.indigenous-ai.net/position-paper/

Pasquale, F. (2020). New laws of robotics: Defending human expertise in the age of AI. Massachusetts: Harvard University Press.

Radin, J. (2017). “Digital natives”: How media and Indigenous histories matter for big data. Osiris, 32(1).

Schwab, K. (2017). The fourth industrial revolution. New York: Penguin.

 

Helen A. Hayes (May 2022). “New Approaches to Critical AI Studies: A Case for Anti-Modernism and Alternative Futurisms.” Interfaces: Essays and Reviews on Computing and Culture Vol. 3, Charles Babbage Institute, University of Minnesota, 45-53.


About the Author: Helen A. Hayes is a Ph.D. Candidate at McGill University, studying the relationship between artificial intelligence, its computational analogs, and the Canadian resource economy. She is also a Policy Fellow at the Centre for Media, Technology, and Democracy at the Max Bell School of Public Policy. She can be reached at helen.hayes@mcgill.ca or on Twitter at helen__hayes.


 

 

JOSEPHINE, THE AVERAGE AMERICAN FEMALE, AND JOE JR., A TYPICAL 6-YEAR-OLD
Figure 1. This is Josephine, the average American female, and Joe Jr., a typical 6-year-old. From the Collection of Smithsonian Institution Libraries.

 

Cybernetics, an intellectual movement that emerged during the 1940s and 1950s, conceived of the body as an informational entity. This separation of the mind and body, and the prioritization of the mind as a unit of information, became a liberating quality as the capitalist world of industrialism, with its mechanical and earthly labor, bound the liberal subject in shackles. The cybernetic subject, in contrast, as “a material-information entity whose boundaries undergo continuous construction and reconstruction,” floated in the permeable ether of information and technology (How We Became Posthuman, 3). Marxist issues of social alienation and scarcity were resolved by the interconnectedness of information-based beings, and hierarchical labor relations were replaced with more communal forms of exchange. A new utopia was thus formed with the advent of digital communication (Brick, 348).

This dematerialized, cybernetic body converged with the creation of technology through the work of the industrial designer Henry Dreyfuss. Dreyfuss, who drafted what can be considered early user personas out of data collected from the military, utilized these imagined bodies for the testing of physical products. Dreyfuss’ designs, or what he labeled as “Joe and Josephine,” quantified the human experience. This model of testing and iterating designs based on dematerialized conceptions of the body was later incorporated into the development of technology by computer scientists such as Ben Shneiderman, who claimed in a 1979 paper that Dreyfuss’ emphasis on the human experience must be considered by engineers and designers. As scholars such as Terry Winograd and John Harwood claim, Dreyfuss’ methodology became the model for user testing that has remained relevant for interaction designers ever since its publication in 1955.

However, as Katherine Hayles argues in How We Became Posthuman: Virtual Bodies in Cybernetics, Literature, and Informatics (1999), the dematerialized body as conceived of by Dreyfuss is problematic. To put it simply, "equating humans and computers is especially easy" if the mind is both an egoless and informational resource to be shared (How We Became Posthuman, 2). Yet, this sort of epistemology neglects embodied and subjective experiences. Race, class, and gender relations cannot be erased by what she labels the "posthuman," and while Hayles published her book over two decades ago, this issue is still pressing in the field of design. As Sasha Costanza-Chock describes in their book Design Justice: Community-Led Practices to Build the Worlds We Need, a "nonbinary, trans, femme-presenting person" is unable to walk through an airport scanner without getting stopped because the system has been built to represent "the particular sociotechnical configuration of gender normativity" (Costanza-Chock). The system, in identifying and classifying the body as information, misses crucial identities. In a paper published in 2018 titled "Gender Shades: Intersectional Accuracy Disparities in Commercial Gender Classification," authors Joy Buolamwini and Timnit Gebru examined a similar problem of bodily erasure (Buolamwini and Gebru, 2018). They found that facial recognition systems trained on biased data sets representing mostly the faces of white men will, unsurprisingly, be biased. The bodies of women and people of color, in this example, are made invisible through their translation into information. As Aimi Hamraie writes in their book Building Access: Universal Design and the Politics of Disability:

Ask an architect about their work, and you may learn more about the style, form, materials, structure, and cost of a building than about the bodies or minds meant to inhabit it. Examine any doorway, window, toilet, chair, or desk in that building, however, and you will find the outline of a body meant to use it. From a doorframe's negative space to the height of shelves and cabinets, inhabitants' bodies are simultaneously imagined, hidden, and produced by the design of built worlds. (Hamraie, 19)

Theme Center - Democracity - Model of towns and countryside
Figure 2. "Democracity," designed by Henry Dreyfuss for the 1939 World's Fair in New York City.
Courtesy of Manuscripts and Archives Division, The New York Public Library. (1935 - 1945).

Architects, industrial designers, and interaction designers wield power when they craft who they imagine will use their built environments, and when they ignore their own biases, designs are built to reify hegemonic systems. There is thus a larger issue of disembodiment which needs to be researched as it relates to the contemporary methodologies of interaction designers.
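The disparity that Buolamwini and Gebru documented is easy to hide behind a single headline number. The sketch below uses invented figures (not the Gender Shades results) to show the arithmetic: a benchmark dominated by one group can report high overall accuracy while failing badly on the people it rarely represents.

# Hypothetical numbers chosen only to illustrate the arithmetic of aggregate
# accuracy: a benchmark dominated by one group can hide large subgroup gaps.
groups = {
    # group: (share of test set, per-group accuracy)
    "lighter-skinned men":   (0.80, 0.99),
    "lighter-skinned women": (0.10, 0.93),
    "darker-skinned men":    (0.06, 0.88),
    "darker-skinned women":  (0.04, 0.65),
}

# Overall accuracy is the share-weighted average of the per-group accuracies.
overall = sum(share * acc for share, acc in groups.values())
print(f"overall accuracy: {overall:.1%}")   # about 96%, despite a 34-point gap

for name, (share, acc) in groups.items():
    print(f"{name}: accuracy {acc:.0%} (share of test set {share:.0%})")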

The relationship between designers and human bodies has a long history. As Christina Cogdell argues in Eugenic Design, the "scientific possibilities of controlling heredity through environmental manipulation inspired reform-minded architects and designers" during the early twentieth century, specifically (Cogdell, 16). Cogdell finds that early industrial designers such as Dreyfuss were swept up in a movement to "streamline" design much in the same way that eugenicists looked to "streamline" the human body (Cogdell, 52-53). Cogdell cites examples such as the 1939 New York World's Fair, which featured Dreyfuss' work against a backdrop that used streamlining as a medium through which to promote democracy (Cogdell, 2004). Events such as these demonstrate the industrial desire to create "perfect" environments and "perfect" bodies, a desire that was not unique to the United States. In Eugenics in the Garden: Transatlantic Architecture and the Crafting of Modernity, Fabiola López-Durán argues that Lamarckian evolution was "an invitation to social engineering" for Latin American nations at the turn of the twentieth century (López-Durán, 4). This form of evolutionary theory fostered an "orientation toward environmental-genetic interaction, empowering an apparatus that made race, gender, class, and the built environment critical measurements of modernity and progress" (López-Durán, 4). While Dreyfuss was engaged with this period of industrial design, this paper departs from these histories by situating Dreyfuss within the post-war era. Nevertheless, this paper recognizes that Dreyfuss' connection to streamlined bodies may have informed his notion of user testing, an important consideration when reviewing images of Joe and Josephine.

In this essay, I will explore the cybernetic conception of the body as it relates to the development of technology. More specifically, I will argue that user testing practices, conceived within the historical and cultural context of cybernetics, envisioned that any human figure might represent all human figures. However, as examined previously, this perception of the body as universal ignores the subjective, material, and embodied experiences of users, contributing to the biased systems we see today. This essay will begin with an exploration of the cybernetic notion of the body. It will then explore how this concept converged with the advent of user testing practices and the development of user personas, or skeuomorphic designs used for the creation of digital products. It will, lastly, attempt to correct the histories of industrial design and interaction design by reconfiguring the work of Dreyfuss. These efforts will hopefully extend contemporary literature, such as the work of Costanza-Chock, Gebru, Buolamwini, and Hamraie, through a re-examination of history.

1950s Cybernetics

Cybernetics emerged as a dominant field in the 1950s through the work of Norbert Wiener and the publication of The Human Use of Human Beings (1950). In The Human Use of Human Beings, Wiener describes a type of communicative society in which humans act as individual, informational units, or automata. These informational, monadic systems relay messages to one another, and through the process of feedback, establish homeostasis. There is thus both a teleological and biological aspect to early descriptions of cybernetics. Like a beehive which has been disrupted, or a flock of geese attempting to take flight, all units must find their place through the interaction and exchange of information with others. This artful dance prioritizes utilitarianism and positivism. The gathering of information through interaction is essential, and in this way, each monad learns to operate as a collective, resisting natural entropic dissolution. The body is thus an extension of and harbor for information. As Wiener writes:

...the individuality of the body is that of a flame rather than that of a stone, of a form rather than of a bit of substance. This form can be transmitted or modified and duplicated...When one cell divides into two, or when one of the genes which carries our corporeal and mental birthright is split in order to make ready for a reduction division of a germ cell, we have a separation in matter which is conditioned by the power of a pattern of living tissue to duplicate itself. Since this is so, there is no absolute distinction between the types of transmission which we can use for sending a telegram from country to country and the types of transmission which at least are theoretically possible for transmitting a living organism such as a human being. (Wiener, 102-103)

The mechanisms of the body, and their ability to maintain life and homeostasis, provide inspiration for the natural, organic order of cybernetics, but nothing more. Information, messages, and communication are key, while the embodied experience, insofar as it is not used to relay messages, is inconsequential.
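For readers unfamiliar with the cybernetic vocabulary, the following minimal sketch (a schematic illustration, not anything Wiener wrote) shows what negative feedback toward homeostasis looks like: a unit repeatedly measures the difference between its state and a set point and corrects a fraction of that error, settling toward equilibrium.

# Illustrative sketch of negative feedback: a unit measures the difference
# between its state and a set point and corrects a fraction of the error each
# cycle, settling toward homeostasis. The values are arbitrary.
def feedback_loop(state, set_point, gain=0.5, steps=10):
    history = [state]
    for _ in range(steps):
        error = set_point - state      # the "message" fed back to the unit
        state = state + gain * error   # corrective action proportional to error
        history.append(round(state, 3))
    return history

print(feedback_loop(state=0.0, set_point=10.0))
# [0.0, 5.0, 7.5, 8.75, ...] -> converges on the set point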

As Katherine Hayles argues in her article "Boundary Disputes: Homeostasis, Reflexivity, and the Foundations of Cybernetics," this divorce of the body from information was essential in the first wave of cybernetics. Hayles outlines three waves of cybernetics, the first two of which concern our argument here. The first wave, from 1945 to 1960, "marks the foundational stage during which cybernetics was forged as an interdisciplinary framework that would allow humans, animals, and machines to be constituted through the common denominators of feedback loops, signal transmission, and goal-seeking behavior" ("Boundary Disputes," 441-467). This stage was established at the Macy conferences between 1946 and 1953, and it was at the Macy conferences, Hayles argues, where humans and machines were "understood as information-processing systems" ("Boundary Disputes," 442). It is also within this first wave that homeostasis was perceived as the goal of informational units. Following the chaos and disillusionment of World War II, first-wave cyberneticians found stability to be paramount. The Macy conferences were thus focused on this homeostatic vision.

However, psychoanalytical insight at the conference helped sow ideas for second-wave cybernetics. If man is to be viewed as a psychological subject, in translating the output of one machine into commands for another, he introduces noise into the teleological goal of homeostasis. This reflexive process, or one in which the autonomy of both subjects is to be considered, disrupted the first-wave one-directional model. As Hayles describes, Lawrence Kubie, a psychoanalyst from the Yale University Psychiatric Clinic “enraged other participants [at the conference] by interpreting their comments as evidence of their psychological states rather than as matters for scientific debate” (“Boundary Disputes,” 459). Nevertheless, while the issue of reflexivity may not have won at the Macy conferences, it later triumphed over homeostasis through the work of biologist Humberto Maturana. Maturana rescued the notion of reflexivity by emphasizing that through the rational observer, the black box of the human mind might be quantified. This new feedback process introduced an autopoietic version of reflexivity in which both man and machine might improve through interaction, resolving the threat of subjectivity. Through both waves of cybernetics, cyberneticians instantiated the concept of the body as immaterial.

Designing for People, Joe, and Josephine

The cybernetic body converged with the development of technology in the 1950s through the work of the industrial designer Henry Dreyfuss. Dreyfuss, who was considered to be one of the most influential designers of his time, developed a model for user testing through skeuomorphic designs which quantified the human experience. While Dreyfuss was not the first to conceive of user testing, he was the first to develop popular user personas. As Jeffrey Yost notes in Making IT Work: A History of the Computer Services Industry, the RAND Corporation's Systems Research Laboratory conducted a simulation study labeled Project Casey that used twenty-eight students to test man-machine operations for early warning air defense (Yost, 2017). The practice of interviewing early adopters of a system continued into the 1960s in time-sharing projects such as Project MAC, in which psychologists such as Ulric Neisser interviewed users about their phenomenological experience with a computer system. It was Dreyfuss, however, who developed pseudo-users that might be used on a wide scale. While command and control computing and human factors researchers demanded testing with specialized users, Dreyfuss aimed, as an industrial designer, to create products for the masses. He therefore looked to craft images of what he deemed lay people for the creation of physical products.

First introduced in his book Designing for People (1955), Joe and Josephine represent Dreyfuss' perception of the "average" man and woman. They have preferences and desires, they are employed, and most importantly, they are forms of a Platonic ideal that can be used for testing products. Like cyberneticians such as Maturana, Dreyfuss seems to have recognized the reflexivity between man and machine. Using Joe and Josephine, Dreyfuss tested the interaction between a product and its imagined user in order to improve its usability. Dreyfuss' book was met with much praise, attesting to the importance of his new model. A review in The New York Times from 1955 titled "The Skin-Men and the Bone-Men" credits Dreyfuss for being a "skin man" who hides the complexity of a mechanism behind its skin (Blake, 1955). In a review in The Nation from the same year, author Felix Augenfeld also credits Dreyfuss for "his fantastic organization and an analysis of his approach to the many-sided problems the designer must face" (Augenfeld, 1955). Joe and Josephine were thus considered innovative figures upon their publication.

As machine-like entities, Joe and Josephine reflect the discussions of the Macy conferences, and as models for user testing, they resemble second-wave reflexivity. However, it is unclear what interactions Dreyfuss had with cybernetics during the 1950s. In an article titled "A Natural History of a Disembodied Eye: The Structure of Gyorgy Kepes's 'Language of Vision'," author Michael Golec describes letters between the cybernetician Gyorgy Kepes and Dreyfuss from the early 1970s (Golec, 2002). Dreyfuss also illustrated a chapter of Kepes' book Sign, Image, Signal (1966), indicating another touch point between the designer and the cybernetician (Blakinger, 2019). The cybernetician Buckminster Fuller wrote the introduction to a publication by Dreyfuss titled Symbol Sourcebook: an Authoritative Guide to International Graphic Symbols (1972), providing a final touch point between Dreyfuss and cybernetics. Nevertheless, there is no direct evidence that Dreyfuss knew of the Macy conferences, and this question needs more research.

Despite the question of Dreyfuss' interaction with cybernetics, Dreyfuss' new model was adopted into cybernetic software and hardware development processes by the 1970s. In a paper by computer scientist Ben Shneiderman titled "Human Factors Experiments in Designing Interactive Systems" (1979), Shneiderman cites Dreyfuss as someone who provides "useful guidance" for the development of computer systems (Shneiderman, 9). Shneiderman also credits Dreyfuss with a user-centered approach that prioritizes the friendliness and compatibility of computer systems with their human users. He advocates for "personalizing" the computer by using human testers, and while he does not directly mention Joe and Josephine, he does state that designers should know their users (Shneiderman, 11). Shneiderman, additionally, cites various cybernetic articles, merging Dreyfuss with cybernetics once again. This process of crafting personas to test prototypes, outlined by Shneiderman, is a practice which has continued into the present day.

The work of scholars such as John Harwood and Terry Winograd demonstrates the permanence of Joe and Josephine in the history of technology. In The Interface: IBM and the Transformation of Corporate Design, 1945-1975, Harwood describes The Measure of Man, a 1959 publication by Dreyfuss which expounded on Joe and Josephine. Harwood finds that The Measure of Man is the primary source for graphic and ergonomic standards within the United States, England, and Canada. He writes that it is "the first and most important, comprehensive collection of human engineering or ergonomic data produced explicitly for architects and industrial designers" (Harwood, 94). Winograd echoes Harwood's claims in an article titled "Discovering America: Reflections on Henry Dreyfuss and Designing for People." Winograd notes that Dreyfuss has been a key figure in the creation of courses for Stanford's d.school, as he is understood as having created the model for empathizing with the user via Joe and Josephine (Winograd, 2008). Both Winograd and Harwood reflect a common perception that Dreyfuss initiated a Kuhnian paradigm shift in the field of design. Through Joe and Josephine, Dreyfuss assisted designers in moving away from the linear development model of Fordism and towards one of circular, iterative feedback. Yet, it is precisely this heroic view of Dreyfuss that I wish to contest, for although Dreyfuss' work is significant, Joe and Josephine introduced the use of biased data into product development. Indeed, Winograd mentions this flaw when he notes that with Joe and Josephine we must also "keep visible reminders of the subtler and less easily depictable social and cultural differences that determine the compatibility of people with products and interfaces…" (Winograd, 2008). However, I argue there is a deeper issue here, one emboldened by cybernetic theory and hidden in the construction of Joe and Josephine. While Joe and Josephine represent the "average" man and woman according to Dreyfuss, they also reflect his bias as a designer and his inability to recognize the quantified body as subjective.

Figure 3. Tilley, Alvin and Henry Dreyfuss and Associates. (1993) Drawing 36. The Measure of Man and Woman.

The Designer as World Builder

In tracing the transition from homeostasis to reflexivity, Hayles notes a complication which elucidates this issue. In analyzing the work of Humberto Maturana and Francisco Varela, two second-order cyberneticians, she finds that Maturana and Varela were system builders who created a system by drawing boundaries to determine what was to be included inside, and what was left out (How We Became Posthuman, 188). As Hayles writes, “Consistent with their base in the biological sciences, Maturana and Varela tend to assume rational observers…Granting constructive power to the observer may be epistemologically radical, but it is not necessarily politically or psychologically radical, for the rational observer can be assumed to exercise restraint” (How We Became Posthuman, 188). The solution to reflexivity conceived in second-order cybernetics is therefore flawed. If the rational observer can quantify the human subject, who is it that edits the observer? An image by computer scientist Jonathan Grudin visualizes this idea. In “The Computer Reaches Out: The Historical Continuity of Interface Design,” Grudin sketches the feedback process between the user and the computer (Grudin, 1989). In the image, a computer reaches out to a user, and the user reaches back. The user is also connected to a wider network of users, who reach back to the user, and therefore to the computer as well. In this system, there is an endless chain of interaction among users and observers, calling into question who is observing whom. As such, no one user can claim to be a world-builder, as they are enmeshed in a socio-material environment.

Dreyfuss, however, claims this title. Joe and Josephine not only represent universal versions of man and woman, like Adam and Eve, but they are the “hero” and “heroine” of Designing for People. Yet, as Russell Flinchum writes in Henry Dreyfuss, Industrial Designer: The Man in the Brown Suit, a “hodgepodge” of information was interpreted by Dreyfuss’ designer Alvin Tilley to construct Joe and Josephine (Flinchum, 87). Additionally, while the exact reports Dreyfuss drew from are unclear, we can surmise their likely sources. In an oral history, Niels Diffrient, one of Dreyfuss’ designers who later iterated on Joe and Josephine, states:

 ...Henry himself had the brilliance, after the Second World War, in which he had done some wartime work of carrying on what he'd learned about human factors engineering...You see, a lot of the war equipment had gotten so complex that people didn't fit into things and couldn't operate things well, like fighter planes, all the controls and everything...So a specialty grew up — it had been there, but hadn't gone very far — called human factors engineering...we found out about these people who were accumulating data on the sizes of people and began to get a storehouse, a file, on data pulled together from Army records, the biggest of which, by the way, and the start of a lot of human factors data, was the information they had for doing uniforms because they had people of all sizes and shapes. (Oral History with Niels Diffrient, Archives of American Art, 2010).

In a later letter to Tilley, he is asked about the specific type of Army data, which helps track which files Dreyfuss may have obtained. The inquirer states that “‘...Douglas Aircraft called to ask if it [The Measure of Man] was available...He asked if the source or sources from which all this data was gathered has been noted’” (Archives of American Art, 2010). Dreyfuss, who had worked on projects for the Vultee Aircraft company during the war, is therefore likely to have used Air Force data as a major source for Joe and Josephine (Flinchum, 1997). A report on wartime anthropometric practices supports this claim. The report, titled “Sampling and Data Gathering Strategies for Future USAF Anthropometry,” mentions that the work of Francis Randall at Wright Field was an excellent example of proper data collection practices during WWII (Churchill & McConville, 1976). Randall’s document, “Human Body Size in Military Aircraft and Personal Equipment,” contains countless drawings of fighter pilot dimensions (Randall, 1946). In the book The Measure of Man and Woman, which improved on the designs of Joe and Josephine, Dreyfuss’ team appears to have been inspired by the depictions of fighter pilots in Randall’s work. A comparison of an image of Joe in a compartment with images of fighter pilots demonstrates how closely aligned Dreyfuss was with military practices.

However, Randall’s report also reflects the long-standing practice of classifying and quantifying bodies based on normative standards prevalent within a specific cultural moment. The manipulation of bodies for military data collection, and the exclusion from these data sets of bodies that did not fit a certain “norm,” has a long history that cannot be revisited here, but it indicates that the inspiration for Joe and Josephine rested on biased data. Consequently, the shapes of the Joe and Josephine personas, which heavily influenced both industrial design and computer design practices, represent biased images. There must be continued investigation into which reports Dreyfuss drew on, but it appears likely that he used skewed data to construct his influential designs.

Figure 4. Randall, Francis et al. (1946) Human Body Size in Military Aircraft and Personal Equipment. Dayton, OH: War Department, Army Air Forces, Air Material Command.

Dreyfuss Today

It is difficult to measure the outcome of such flawed practices, but the work of Dreyfuss has resonated throughout the century. The ripple effect of Joe and Josephine, and the countless products drafted from these designs, brings forth a new variable to consider in the construction of digital products. This paper is therefore a response to the many accounts which have canonized Dreyfuss within the history of industrial design, and consequently, the history of interaction design. As demonstrated through the reference to Winograd, Dreyfuss’ efforts are taught in the classroom. However, through the conception of both real and imagined spaces, designers envision an ideal user, and this user can either represent the multiplicity of complex, messy, and beautiful bodies, or it can represent a “universal” ideal which never truly existed. Tracing the genealogy of these imagined users to their origins is essential for improving the testing practices of our modern moment.


Bibliography

Augenfeld, F. (1955, August 6). Masterpieces for Macy's. The Nation.

Blake, P. (1955, May 15). The Skin Men and the Bone Men. The New York Times.

Blakinger, J. R. (2019). Gyorgy Kepes: Undreaming the Bauhaus. Cambridge, MA: The MIT Press.

Brick, H. (1992). Optimism of the mind: Imagining postindustrial society in the 1960s and 1970s. American Quarterly, 44(3), 348. doi:10.2307/2712981

Buolamwini, J., & Gebru, T. (2018). Gender Shades: Intersectional Accuracy Disparities in Commercial Gender Classification. Conference on Fairness, Accountability, and Transparency, Proceedings of Machine Learning Research.

Churchill, E., & McConville, J. T. (1976). Sampling and data gathering strategies for future USAF anthropometry. Wright-Patterson Air Force Base, OH: Aerospace Medical Research Laboratory.

Cogdell, C. (2010). Eugenic design: Streamlining America in the 1930s. Philadelphia, PA: University of Pennsylvania Press.

Costanza-Chock, S. (2020). Design Justice. Cambridge, MA: The MIT Press.

Dreyfuss, H. (1976). The Measure of Man. Watson-Guptill.

Dreyfuss, H. (2012). Designing for People. New York, NY: Allworth Press.

Dreyfuss, H. (2014). Posters, The Measure of Man (Male and Female) [Cooper Hewitt Design Museum]. Retrieved 2022, from https://collection.cooperhewitt.org/objects/51497617

Winograd, T. (2008). Discovering America: Reflections on Henry Dreyfuss and Designing for People. In T. Erickson & D. W. McDonald (Eds.), HCI Remixed: Essays on Works That Have Influenced the HCI Community. Cambridge, MA: MIT Press.

Flinchum, R. (1997). Henry Dreyfuss, Industrial designer: The man in the brown suit. New York: Rizzoli.

Golec, M. (2002). A Natural History of a Disembodied Eye: The Structure of Gyorgy Kepes's Language of Vision. Design Issues, 18(2), 3-16. doi:10.1162/074793602317355747

Grudin, J. (1989). The Computer Reaches Out: The Historical Continuity of Interface Design. DAIMI Report Series, 18(299). doi:10.7146/dpb.v18i299.6693

Hamraie, A. (2017). Building access: Universal design and the Politics of Disability. Minneapolis, MN: University of Minnesota Press.

Harwood, J. (2016). The Interface: IBM and the Transformation of Corporate Design, 1945-1976. Minneapolis, MN: University of Minnesota Press.

Hayles, N. K. (1994). Boundary disputes: Homeostasis, reflexivity, and the foundations of cybernetics. Configurations, 2(3), 441-467. doi:10.1353/con.1994.0038

Hayles, N. K. (2010). How we became posthuman: Virtual bodies in cybernetics, literature, and informatics. Chicago, IL: University of Chicago Press.

López-Durán, F. (2019). Eugenics in the garden: Transatlantic architecture and the crafting of modernity. Austin, Texas: University of Texas Press.

Oral history interview with Niels Diffrient. (2010). Retrieved March 7, 2022, from https://www.aaa.si.edu/collections/interviews/oral-history-interview-niels-diffrient-15875

Randall, F. E. (1946). Human Body Size in Military Aircraft and Personal Equipment. Dayton, OH: Army Air Forces Air Material Command.

Shneiderman, B. (1979). Human Factors Experiments in Designing Interactive Systems. Computer, 12(12), 9-19. doi:10.1109/mc.1979.1658571

Tilley, Alvin, and Henry Dreyfuss and Associates. (1993). Drawing 36. The Measure of Man and Woman.

Vultee Aircraft, Inc., military aircraft. (n.d.). Retrieved March 7, 2022, from https://www.loc.gov/item/2003690505/.

Wiener, N. (1967). The Human Use of Human Beings: Cybernetics and Society. New York, NY: Avon Books.

Yost, J. R. (2017). Making IT Work: A History of the Computer Services Industry. Cambridge, MA: MIT Press.

 

Caitlin Cary Burke (March 2022). “Henry Dreyfuss, User Personas, and the Cybernetic Body.” Interfaces: Essays and Reviews in Computing and Culture Vol. 3, Charles Babbage Institute, University of Minnesota, 32-44.


About the author: Caitlin Burke is a Communication PhD student at Stanford University, where she studies user experience design, design ethics, media history, and human-computer interaction.


 

 

Figure 1: Queen Elizabeth II touring Burroughs Strathleven factory, 1953.
Courtesy of Charles Babbage Institute Archives.

 

The first finding is that long before computers, the Internet, or social media became available, people on both sides of the Atlantic were heavily dependent on organized (usually published) information on a regular basis. Studies about the history of the week, children’s education, work-related activities, and devotion to religious and community practices have made that abundantly clear. The challenge for historians now, therefore, is to determine how best to catalog, categorize, and study this sprawling subject of information studies in some integrative, rational fashion. Do we continue to merely study the history of specific pieces of software or machinery, ephemera such as newspapers and books, or providers of information such as publishers and Google?

A Framework for Studying Information’s History

In my three-volume Digital Hand (2004-2008) and subsequently in All the Facts: A History of Information in the United States Since 1870 (2016), I shifted partially away from exploring the role of providers of information and its ephemera toward how people used facts—information and data. In the research process, categories of everyday information began to emerge, and so did periods—think epochs, eras. As with most historical eras, these overlapped, signaling a changing world.

The same held true for the history of the types and uses of information and, of course, the technologies underpinning them. We still use landlines and smartphones; we still fill out paper forms asking for information requested for a century, as well as online ones; and of course we use both First Class mail and e-mail. Publishers Weekly routinely reports that only 20 percent of readers consume some e-books, while 80 percent of all readers still rely on paper books, so old norms still apply. Apple may post a user manual on its website, but buy an HP printer and you are likely to find a paper manual in its shipping box.

All the Facts reported that there were types of information ephemera that existed from the 1800s to the present, supplemented by additional ones that did not replace earlier formats. Obvious new additions were electrified information, such as the telegraph, telephone, radio, and TV. Paper-based information was better produced by people using typewriters and better quality pens and pencils, and stored in wooden, later metal, file cabinets and on 3 x 5 cards, still later in computers, PCs, smartphones, and now digital doorbell databases. Each improvement also made it easier to store information in logical ways, such as on 3 x 5 cards or in folders.

Figure 2: A side table in an American living room, ca. 1930s.

The volume of their use grew massively; humble photographs of the interiors of homes and offices taken over the past 150 years illustrate that behavior, as does the evolution of the camera, which too is an information-handling device. Commonly used ephemera across the entire period include newspapers, magazines, books, telegraphy, telephones, radios, television, personal computers, smartphones, and other digital devices, all arriving in that order. So, any chronology or framework should take their use into account. If you are reading this essay in the 2020s, you are familiar with the myriad ways you have relied on information and appropriated these devices, with the probable exception of the telegraph, which passed into history by the early 1970s.

A second category that any framework needs to incorporate, because it remained a constant topic of concern across the entire two centuries, is the information people needed to lead their personal lives, such as medical information to cure illnesses, political information to inform their opinions and voting practices, and so forth. Historians now better understand that work-related activities required massively increased uses of information to standardize work processes, run ever-larger organizations, and provide new products and services. I, and others, continue to study those realms of information use, because they kept evolving and expanding across the past two centuries—a process that shows no signs of slowing. The historical evidence points, however, to several categories of information evident in use in this period in private life. These include consulting published—I call it organized—information on taking care of one’s home and raising children, sports and hobbies, vacations, and interacting with one’s church, community and nonprofit organizations, and government agencies at all levels. Participation in civic and religious institutions, in particular, represented a “growth industry” for information across the two centuries. Sales volumes for books and magazines provide ample evidence of this, just as sales statistics for PCs and smartphones do today. People also relied on information available in public spaces. These included public libraries; billboard advertisements; government signs and messages along roads and highways, both painted and digitized; advertisements on the sides of buildings; and a massive increase in the use of maps available from publishers, from state highway departments, and as metal signs on roads. Users worried about privacy issues, a concern expressed in North America as early as the 1600s and still with us today.

Role of the Internet

But what about the Internet? By now the reader will have concluded that everything mentioned already had rapidly migrated to the Internet too, certainly by the very early 2000s. We have already created frameworks for phases in the development and use of the Internet, such that we accept 1994-1996 as phase one of wide use (adoption or appropriation), 1997-1998 as a second phase with the ability to conduct interactive information exchanges, a third with the introduction of feedback loops that began in about 2002-2003, and yet another involving the adoption of social media applications soon after. Each had its applications: Phase 1 with product brochures, mailing addresses, telephone numbers, and some e-mail; Phase 2 with intranets, databases, order taking, and organizational news; Phase 3 with the seeking of feedback, customer engagement, and business partner collaboration; and Phase 4 with the posting of personal information (remember the photographs of cats on Facebook?), communities of practice and customers sharing information, including churches, civic organizations, and clubs, and the rise of information analytics. Historical data documented the rapid diffusion of these practices, such that over half the world today uses the Internet to share information (more on that later). Usage became a new central facet of people’s daily lives.

Because we are discussing the Internet’s use, it is worth noting that the most widely sought-after categories of Internet-sourced information in its early stages, which continue to the present, are political information and, even more so, pornography and health. Increasingly, too, people seek out games and, always, “how to” advice. Libraries became spaces one could go to for access to the Internet. Starting in 2007, people across the world were able to access information more quickly and more often than before due to the introduction of the smartphone.

In All the Facts we published a photograph of a public library in San Antonio, Texas, from 2013 that had no books; rather, it looked like an Apple Store with rows of screens. Today, such spaces are common in most public, school, and university libraries in scores of countries. Increasingly since the early 2000s, people received growing amounts of news from sites on the Internet, and today news aggregators pull that together according to a user’s preferences for topics and timelines. Religion and raising children are widely covered by information sources on the Internet. In fact, by about 2015 the public expected that any organization had to have a presence on the Internet: civic organizations, every government agency one can imagine, schools, universities, industry and academic associations, stores (including brick-and-mortar versions), political parties, clubs, neighborhood associations, and even children’s playgroups. I found few exceptions to this statement when writing All the Facts.

Beginning in the 1950s, historians began to catalog the types of information that became available from these organizations. Following the lead of librarians, who in the 1800s had started the cataloging process we are familiar with today, these catalogs were organized by types of ephemera (e.g., books, magazines, journals) and by topics (e.g., physics, history, economics). Historians are now beginning to go further: William Aspray and I with our current research about the types of fake information and their impact on truth and authenticity, and others exploring what information people seek through the Internet and how people use information on social media.

As to categories of information: by 2014, for example, the Ford Motor Company was providing online information about the company, news, its products, the role of innovation, people and careers, media postings, space for customer feedback, contact addresses, stock data and investor facts, a social media sub-site, facts about customer support, automotive industry facts, and declarations about privacy policies. Meticulously documenting these categories of information for thousands of such organizations demonstrates the diversity—and similarity—of the types of information that one came to expect. Note, however, that the information and functions cataloged about Ford had been available in paper-based forms since the 1920s, just not as easily or quickly accessible.

Figure 3: Nurse using punch cards (date unknown).

Returning to the pre-Internet era, think in terms of eras (phases) by going beyond when some form of ephemera became available. The ephemera or technologies themselves added diversity, convenience, speed, less expensive communications, and the capability of moving ever-increasing volumes of information. Historians have done considerable research on these five features. However, information historians are just beginning to realize that by focusing their concerns on the information itself, pivoting away from the technologies themselves (e.g., books and computers), they see the endurance of some topics—access to medical information, facts about raising children, or cooking recipes—regardless of format or technology used.

Thinking this way expands our appreciation for the extent of a society’s use of information and, just as relevant, for how individuals used it too. In a series of books produced by Aspray, one could see how data-intensive the lives of people of all ages, socio-economic statuses, and interests became over time. I have argued in All the Facts and elsewhere that this kind of behavior, that is to say, ever-increasing reliance on organized information, had been on the rise since the early 1800s.

Recent Findings and Thinking

While All the Facts lays out the case for concluding that we lived in yet a second information age—not THE Information Age of the post-World War II period—that book was published in 2016, and so much has happened since then. Rapid changes in the realities facing historians of information keep pounding the shores of their intellectual endeavors on three beaches: Internet usage, fake news and misinformation, and the changing forms of information.

In 2021 the Pew Research Center reported that 95 percent of American adults living in urban centers used the Internet, as did 94 percent of suburban and 90 percent of rural residents. In comparison, in 2015, when the writing of All the Facts wrapped up, urbanites were at 89 percent, suburbanites at 90 percent, and rural residents at 81 percent. Since 2000, users have doubled as a percent of the total population. The overall number of Americans using the Internet in 2021 had reached 93 percent of the population. Smartphone usage also increased, now one of the top drivers of Internet usage, thanks to both the increased availability and affordability of this technology. Similar overall statistics could be cited for other OECD, Asian, and South American societies. Convenience and affordability combined are driving use all over the world, no longer just in the wealthiest societies.

Other surveys conducted in the United States by Pew reported that over 70 percent of residents in 2012 thought the information they obtained was accurate and trustworthy—just before the furor over misinformation became a major issue of concern in American society, expressed by both the politically energized Right and Left, by students of misinformation, and by many in the media and in senior government positions. But the types of information people sought were the same as in prior decades.

The problems survey respondents expressed emanated from where fake news or misinformation resided. Fake news and misinformation were not confined to sources on the Internet; they appeared in books, television programs, magazines, and radio programs, often promulgated by agents operating across multiple digital and paper-based platforms. Information scholars are increasingly turning their attention to this problem, as have Aspray and I, reporting our results in a series of books and papers. However, as he and I have emphasized and documented, this has been a concern and an overt activity since the eighteenth century.

In Fake News Nation (2019) we focused largely on political and industry-centered examples, while in a sequel, From Urban Legends to Political Fact-Checking: Online Scrutiny in America (2019), we began documenting the nation’s response to this growing problem. The physical world’s battles over politics and such issues as the role of tobacco, oil, and environmental damage had moved to the Internet, but they also represented terrain fought over long before the use of the web. If anything, the role of misinformation has spilled over into myriad issues important to today’s citizens: health, vaccines, historical truths, racism, and product endorsements and descriptions, among others. Trusted sources of impartial news and information competed for attention with mischievous purveyors of misinformation and with people at large opining on all manner of subjects. These activities disseminating false or misleading information represent a new development of the past decade because of their sheer volume, even though their patterns are becoming increasingly familiar to historians studying earlier decades, even centuries.

But perhaps for historians the most interesting new research question is the nature of how information changes. To make All the Facts successful, it was enough, and highly revelatory, to document carefully the existence, extent, and use of information across essentially all classes, ethnic and racial groups, and ages, and to present a framework for gaining control over what otherwise were massive collections of organized information. That exercise made it possible to argue that any short list of research priorities for modern society (i.e., since at least the start of the Second Industrial Revolution) had to include the role of information in all manner of activity. During that project, however, it became evident that information itself (or, what constituted information) was changing, not simply increasing or becoming more diverse and voluminous. It also became evident that this transformation of information and these new bodies of fact were leading to the emergence of new professions and disciplines, along with their social infrastructures, such as professorships, associations, and literature.

Figure 4: IBM's Type 070 vertical sorters, ca. 1910s.
Courtesy of IBM archives.

For example, regarding changing information: it became increasingly electrified, beginning with the telegraph in the 1840s and extending to the “signals” that computer scientists and even biologists explore today. There are biologists and other scientists who argue that information is a ubiquitous component of the universe, just as we have accepted that same idea regarding the presence of energy. Intelligence could no longer be limited to the anthropomorphic definitions that humans had embraced, emblematically called artificial intelligence. Trees communicate with each other, as do squirrels and birds, about matters relevant to their daily lives.

Regarding the second point—the development of new professions—before the 1870s there was insufficient knowledge about electricity to create the profession of electrician, but by the 1880s it existed, developed its own body of information and professional practices, and rapidly became a licensed trade. In the years that followed, medical disciplines, plumbing, accounting, business management, scientists in all manner of fields, and even, later, airplane pilots, radio engineers, and astronauts became part of modern society. They all developed their associations, published specialized magazines and journals, held annual conventions and other profession-centered meetings, and so forth. Probably every reader of this essay is a product of that kind of transformation.

Prior to the mid-nineteenth century, most professions had been relatively stable for millennia, as had the percentages of populations engaged in subsistence agriculture, law, religion, warfare, and the tiny cohort of artisans. That reality has been thoroughly documented by economic historians, such as Angus Maddison in his voluminous statistical collections (2005, 2007), who pointed out that national income levels, economic productivity, and population did not change radically until more and different information began arriving. This was not a coincidence.

Understanding how information transformed, and its effects on society, is a far more important subject to investigate than what went into All the Facts because, like the investigations underway about misinformation, it reaches into the very heart of how today’s societies are shaped and function. The earlier book was needed to establish that there was a great deal more for historians to communicate than could be captured by limiting their studies to the history of books or newspapers, or to the insufficient number of studies done about academic and discipline-centered institutions.

Figure 5: Man looking at punched paper tape, ca. 1960.
Courtesy of Charles Babbage Institute Archives.

Now we will need to explore more carefully how information changed. I propose that this be done initially by exploring the history of specific academic disciplines and the evolution of their knowledge bases. That means understanding, and then comparing across disciplines, the role of, for instance, economics, physics, chemistry, biology, history, engineering, computer science, and librarianship. This is a tall order, but essential if one is to understand patterns of emerging collections of information and how they were deployed, even before we can realistically jump to conclusions about their impact. Too often “thought leaders” and “influencers” do just that, in the process selling many books and articles but without the empirical understanding that the topic warrants.

That is one opinion about next steps. Another is that the democratization of information creation and dissemination is more important. The argument holds that professionals and academics are no longer the main generators of information; millions of people are instead. There are two problems with this logic, however. First, such an observation is about today’s activities, while historians want to focus on earlier ones, such as information generation prior to the availability of social media. Second, there is a huge debate underway about whether all of today’s “information generators” are producing information or misinformation, or are simply opining. As a historian and an avid follower of social media experts, I would argue that the issue has not been authoritatively settled, and so the actions of the experts still endure, facilitated by the fact that they control governments, businesses, and civic organizations.

I am close to completing the first of two books dealing precisely with the issue of how information transformed. It took me 40+ years of studying the history of information to realize that how it changed was perhaps the most important aspect of information’s history to understand. That realization had been obscured by the lack of precision in understanding what information existed. We historians approached the topic in too fragmented a way; I am guilty, too, as charged. But that is not to say that the history of information technology—my home sub-discipline of history and work—should be diminished; rather, IT’s role is far more important to understand, because it is situated in a far larger ecosystem that even transcends the activities of human beings.


Bibliography

Aspray, William (2022). Information Issues for Older Americans. Rowman & Littlefield.

Aspray, William and James W. Cortada (2019). From Urban Legends to Political Fact-Checking. Rowman & Littlefield.

Aspray William and Barbara M. Hayes (2011). Everyday Information. MIT Press.

Bakardjieva, Maria (2005). Internet Society: The Internet in Everyday Life. Sage.

Blair, Ann, Paul Duguid, Anja-Silvia Goeing, and Anthony Grafton, eds. (2021). Information: A Historical Companion. Princeton.

Chandler, Alfred D., Jr. and James W. Cortada, eds. (2002). A Nation Transformed by Information. Oxford.

Cortada, James W. (2016). All the Facts: A History of Information in the United States Since 1870. Oxford.

Cortada, James W. (2021). Building Blocks of Society. Rowman & Littlefield.

Cortada, James W. (2004-2008). The Digital Hand. Oxford.

Cortada, James W. (2020). Living with Computers. Springer.

Cortada, James W. (2002). Making the Information Society. Financial Times & Prentice Hall.

Cortada, James W. and William Aspray (2019). Fake News Nation. Rowman & Littlefield.

Gorichanaz, Tim (2020). Information Experience in Theory and Design. Emerald Publishing.

Haythornthwaite, Caroline and Barry Wellman, eds. (2002). The Internet in Everyday Life. Wiley-Blackwell.

Maddison, Angus (2007). Contours of the World Economy, 1-2030 AD. Oxford.

Maddison, Angus (2005). Growth and Interaction in the World Economy: The Roots of Modernity.

Ocepek, Melissa G. and William Aspray, eds. (2021). Deciding Where to Live. Rowman & Littlefield.

Zuboff, Shoshana (2019). The Age of Surveillance Capitalism. Public Affairs.

 

James W. Cortada (February 2022). “What We Are Learning About Popular Uses of Information, The American Experience.” Interfaces: Essays and Reviews in Computing and Culture Vol. 3, Charles Babbage Institute, University of Minnesota, 19-31.


About the author: James W. Cortada is a Senior Research Fellow at the Charles Babbage Institute, University of Minnesota—Twin Cities. He conducts research on the history of information and computing in business. He is the author of IBM: The Rise and Fall and Reinvention of a Global Icon (MIT Press, 2019). He is currently conducting research on the role of information ecosystems and infrastructures.


Editors’ note: This is a republication of an essay (the second one) from Blockchain and Society, a newly launched blog of essays by CBI Director Jeffrey Yost. As a one-time crossover at the launch of the blog and site, Interfaces is republishing an essay Yost wrote on gender inequity and disparity in participation in the development and use of cryptocurrency. This one-time republication is meant to introduce Interfaces readers to the blog, and its topic is an especially good fit with Interfaces’ mission. Please consider also subscribing to the blog: https://blockchainandsociety.com/

 

Few Women on the Block: Legacy Codes and Gendered Coins

Jeffrey R. Yost

Abstract: Despite major differences in levels of participation in computing and software overall, the decentralized cryptocurrency industry and space is far more skewed with roughly 90 percent men and 10 percent women (computer science overall is around 20 percent women). This article explores the history of gender in computing, gender in access control systems, gender in intrusion detection systems, and the gender culture of Cypherpunks to historically contextualize and seek to better understand contemporary gender disparity and inequities in cryptocurrency.


 

Given that decentralization is at the core of the design and rhetoric of cryptocurrency projects, the field often highlights, or hints at, small to mid-sized flat organizations and a dedication to inclusion. Crypto coin and platform projects’ report cards on diversity, however, are uneven. While an overall diversity of BIPOC exists in cryptocurrency, it is quite unequal, as the founding and leadership of Bitcoin (its team; the creator is anonymous) and of the top 30 altcoins (alternatives to Bitcoin) are disproportionately white North Americans, Europeans, and Australians, along with East Asians. With gender, inequalities are especially prevalent, in both participation and resources. A half dozen or so surveys I found, spanning the past few years, suggest (in composite) that women’s participation in the crypto workforce is slightly less than 10 percent. There are few women on the block—far fewer, percentagewise, in cryptocurrency than the already quite gender-skewed low ratios in computing and software. On the adoption side, twice as many men own cryptocurrency as women.

This essay, on women in cryptocurrency, concentrates on gender inequities, as well as intersectionality. It discusses early research in this area, standout women leaders, and organizational efforts to address gender imbalances and biases. It places this discussion in larger historical contexts, including women in computing, women in security, women in cryptography, and women in, or funded by, venture capital. It also highlights the rare instances of women CEOs in cryptocurrency. Achieving greater gender balance is a critically important ethical issue. It is also good business: many studies show that corporations with gender balance on boards and women in top executive positions outperform. My essay posits that historical, terminological, spatial, and cultural partitions and biases block gender inclusion and amplify inequality in cryptocurrency development, maintenance, and use.

Major Gender Disparities in Cryptocurrency

A major study by the international news organization Quartz surveyed the 378 cryptocurrency projects between 2012 and 2018 that received venture capital funding (Hao, 2018). Many cryptocurrency projects do not have this luxury or take this path, as they raise funds from friends and family, bootstrap, or rely on other means at the start. Venture capital funded projects tend to have greater resources and key connections to grow. Most of the largest coin projects have taken on venture capital support at some point in their young histories. The dynamic is self-reinforcing: rich projects tend to grow richer through R&D and marketing and through the momentum of network effects, as captured by Metcalfe’s Law (the value of a network is proportional to the square of its number of users), while under-resourced coin projects often cease within several years as capitalizations descend toward $0.
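To make the parenthetical formula concrete, here is a minimal worked example (the growth factor is an illustrative assumption, not a figure from the Quartz study): if network value scales with the square of the user count, a project that grows its user base tenfold gains roughly a hundredfold in network value, while a stalled project gains nothing.

$$V \propto n^{2} \quad\Longrightarrow\quad \frac{V(10n)}{V(n)} = \frac{(10n)^{2}}{n^{2}} = 100.$$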

Of these 378 venture-funded projects, only 8.5 percent had a woman founder or co-founder. Venture capital (VC) is dominated by men—about 90 percent—and in terms of partners and senior positions at major VC firms, disparities are even starker (as reported in the NBER Digest, 9/2017). The venture domain is also very heavily biased toward funding the projects of white male entrepreneurs, and this is even more skewed in terms of capital offered or deployed. To illustrate, a study by the software and finance data firm PitchBook, reported on by Forbes, found that in 2018 women founders received just 2.3 percent of total venture capital funding raised in the crypto universe (Gross, 2019).

In the information technology (IT) field broadly, roughly 18 percent of projects have a woman leader or co-leader. Even against this quite low percentage in IT, crypto is disturbingly lower still—in fact, well under half that level.

On the adoption and use side, the pattern differs from that of BIPOC adoption, which in the US is nearly double that of whites (measured as the participation rate of owners at any level, not as crypto wealth): women hold crypto at only half the rate of men. Men make up two-thirds of crypto holders/users and women just one-third.

Looking Backward at Backward, Gendered Computing Cultures

Figure 1: Women plug board programming.
Photo credit: Charles Babbage Institute/CBIA, University of Minnesota Libraries.

Computing is a field that has had substantial and important technical contributions by women from the start. This dates to the women who programmed the ENIAC, the first meaningful electronic digital computer, from the mid-1940s to the early 1950s. At the same time, the field and the industry have been held back by discrimination in hiring, and there have been heavily male-gendered environments from the beginning. This has been true in the U.S., as documented in the tremendous scholarship of Janet Abbate (Abbate, 2012) and others, and in the United Kingdom, as shown in the masterful work of Mar Hicks (Hicks, 2017).

Gender in IT remains substantially understudied, especially in some geographies. There is also a dearth of literature regarding some industry segments, corporations, and organizations on the production side, as well as regarding much of the maintenance and use domains. Discriminatory practices against women and transgender people have been and remain pronounced in the military, national and local governments, industry, national laboratories, nonprofits, universities, and beyond.

Thomas Misa’s pathbreaking, deeply researched work, published in Communications of the ACM and part of a larger book project, indicates there was not a golden age of women’s participation in the early years, but rather continuous, steady, low, and range-bound participation—between the high single digits and the upper teens, percentagewise—from the mid-1950s to the mid-1970s (Misa, 2021). His research draws on the best available data for the early years, user groups (and in the above I am giving extra weight to IBM’s user group SHARE, Inc. in combining Misa’s totals for groups, since it represented 60-plus percent of the industry and its nearest competitor was always under 15 percent). Following this two-decade span was a gradual upward trend that ramped up in the 1980s, when late in the decade women’s participation in computer science education and the workforce peaked at 37 to 38 percent. In the 1990s it fell sharply, as Misa and other authors explored in his important edited volume, Gender Codes (2010).

Participation, environment, culture, and advancement are all important. My own work has helped document gender inequality in promotion to leadership roles in software and services companies in the US, especially pre-2000 (Yost, 2017). In recent years and decades, women’s participation as computer science majors at US universities has hovered around 20 percent. Why the huge drop and a recovery to only about half the former peak? The widespread adoption of PCs, the gendering of early PCs, gendered gaming (especially popular shooter games), the rise of male geek culture, and inhospitable environments for women are among the likely factors, as the publications of Abbate, Hicks, Misa, and others richly discuss. More attractive opportunities in law, medicine, and business outside IT likely are factors too, as participation in these areas rose as computing participation fell. And far from being free of discrimination, these professional areas may, on a relative basis, have had less of it.

Gender in Different Computer Security Environments

In co-leading a major computer security history project for the National Science Foundation (NSF Award Number: 1116862) a half decade ago (and I am thrilled that, just yesterday, we received another multiyear grant from NSF, on privacy and security, a CBI project I am co-leading with G. Con Diaz), I published “The March of IDES: A History of Intrusion Detection Expert Systems” (Yost, 2015). I highlighted gender in one important area of computer security, intrusion detection. Early intrusion detection involved manually printing out logs and painstakingly reviewing the printouts, as the eyes of the security officers, auditors, and systems administrators who did this work glazed over. As computer use grew, fan-folded printouts piled up in multiple stacks toward the ceiling at many computer centers; the task soon overwhelmed. In the 1980s automated systems were developed to flag anomalies for selective review by humans, and the artificial intelligence of expert systems was first applied in pioneering work to help meet the growing challenge (Yost, 2016).
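The kind of automated flagging described above can be illustrated with a minimal sketch in Python (the log format, field names, and threshold here are hypothetical illustrations for exposition, not drawn from IDES or any historical system): count failed logins per account and surface only the accounts that exceed a threshold, so a human reviewer inspects a short list rather than a ceiling-high stack of printouts.

from collections import Counter

# Hypothetical log lines ("timestamp user outcome"); not the format of any historical system.
LOG_LINES = [
    "1985-06-01T02:14 alice FAIL",
    "1985-06-01T02:15 alice FAIL",
    "1985-06-01T02:16 alice FAIL",
    "1985-06-01T02:17 alice FAIL",
    "1985-06-01T09:03 bob OK",
]

FAIL_THRESHOLD = 3  # illustrative cutoff for flagging an account as anomalous


def flag_anomalies(lines, threshold=FAIL_THRESHOLD):
    """Return accounts whose failed-login counts reach the threshold,
    so a human reviewer inspects only the flagged accounts."""
    failures = Counter()
    for line in lines:
        _, user, outcome = line.split()
        if outcome == "FAIL":
            failures[user] += 1
    return {user: n for user, n in failures.items() if n >= threshold}


print(flag_anomalies(LOG_LINES))  # prints {'alice': 4}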

Professor Dorothy Denning, Naval Postgraduate School, 2013. Led IDES at SRI.

The National Security Agency had a very important pioneering research program in the 1980s and 1990s to fund outside intrusion detection work, called Computer Misuse and Anomaly Detection, or CMAD. This program was led by Rebecca Bace. The dollar amounts were not huge—they did not need to be—and Bace, with great energy and skill, expertly worked with the community to get much pioneering work off the ground toward impactful R&D at universities, national labs, and nonprofit research corporations like SRI. In conducting oral histories with Dorothy Denning, Teresa Lunt, and Becky Bace (the full text of these published interviews is available at the CBI website/UL Digital Conservancy), I got a sense of the truly insightful scientific and managerial leadership of the three of them (Yost, 2016).

The accelerating, sometimes playful, but also quite malicious and dangerous hacking of the 1970s and 1980s (for those Gen Xers and boomers reading this, remember WarGames, and some of the non-fictional scares written about in newspapers?) became a serious problem. The US government often was a core target of individual and state-sponsored hackers in the Cold War. This fostered a need (and federal contracts) for the field of intrusion detection systems. As such, and over time, there increasingly were funds and contracts to complement the modest grants, often under $100,000, provided from Bace’s NSA (CMAD) program.

This resulted in essentially a new computer science specialty opening at universities in the 1980s and 1990s: a subset of computer security, intrusion detection. There were some standout male scientists as well, but at the origin, and for years to follow, women computer scientists disproportionately were the core intellectual and project leaders. Scientists such as Denning, Lunt, and Bace, as well as Kathleen Jackson (NADIR at Los Alamos) and other women, headed the top early projects and provided the insightful technical and managerial leadership that allowed this computer security and computer science specialty to thrive (Yost, 2016).

Rebecca "Becky" Bace (1955-2017).

Another computer security area I researched for NSF was access control systems and standards. This was all about knowing how operating systems worked, secure kernels, and the like. It was by far the largest computer security field in terms of participants, papers, funding, and standard-setting efforts, and it was overwhelmingly male. Operating systems (OS) was an established research area prior to access control becoming a key domain within it. Access control emerged as an area within the larger OS domain in response to breaches in the earliest time-sharing systems in government and at universities. MIT’s pioneering early-1960s Compatible Time-Sharing System (CTSS) had little security; with its successor of the late 1960s and beyond, MULTICS, project leader Fernando Corbato and other top computer scientists at MIT, like Jerome Saltzer, made security design central to the effort.
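As a generic illustration of what “access control” means in this context, here is a minimal sketch of the classic subject/object permission check (the users, files, and rights are hypothetical, and this is not a description of CTSS or MULTICS internals): the system consults a table of who may do what to which object before permitting an operation.

# Minimal access control matrix: rows are subjects (users), columns are objects (files),
# and entries are the rights a subject holds on an object. Entries here are purely illustrative.
ACCESS_MATRIX = {
    ("alice", "payroll.dat"): {"read", "write"},
    ("bob", "payroll.dat"): {"read"},
}


def check_access(subject: str, obj: str, right: str) -> bool:
    """Permit an operation only if the matrix lists the requested right."""
    return right in ACCESS_MATRIX.get((subject, obj), set())


print(check_access("alice", "payroll.dat", "write"))  # True
print(check_access("bob", "payroll.dat", "write"))    # False: bob may only read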

Operating systems research and development, in academia, industry, the DoD, DOE, etc. was overwhelmingly male and very well-funded. It followed that access control became an overwhelmingly male specialty of computer security and received strong federal research program and contract support.

Reflecting on this prior scholarship—women as the key leaders of the new (1980s) intrusion detection area and men as the leaders of many of the most important operating system and access control projects—I have been pondering whether it provides any context or clues as to why, to date, the founders of cryptocurrency projects have largely been men. At the very least, I think it is suggestive regarding established and new specialties, connections between them, historical trajectories, and gendered opportunities and participation. A wholly new area, arising alongside a dominant, more visible, and better funded area, can at times offer greater opportunities for newcomers to the new area of security, including for women.

Following from this, I have begun to consider a related question: to what extent is cryptocurrency a new area offering new demographics and dynamics, and to what extent is it a continuation of the evolving field of cryptography? And how was this influenced by older cryptography and its renaissance in an impactful new form, a new direction?

In the mid-1970s and 1980s, with the emergence and rapid growth of a new form of cryptography, public key cryptography developed a strong intellectual and institutional foundation, especially thanks to the work of six men who would later win the Turing Award: early crypto pioneers Whitfield Diffie and Martin Hellman (authors of the landmark 1976 “New Directions…” paper); Ron Rivest, Adi Shamir, and Leonard Adleman, the three from MIT known as RSA; and Silvio Micali, also of MIT. Rivest, Shamir, and Adleman, in addition to the RSA algorithm, would start the company RSA Data Security; it would launch a pivotal event, the RSA Conference, and spin off an important part, authentication, as Verisign, Inc. After some initial managerial and financial stumbles, the highly skilled James Bidzos would successfully lead RSA Data Security and, as Chair of the Board, Verisign.

In addition to his Turing Award, Micali had earlier won the Gödel Prize. In 2017, Micali became the founder of Algorand, a “Proof-of-Stake” altcoin project now capitalized at more than $10 billion, and along with running it, he is a Computer Science Professor at MIT. Algorand offers much in being environmentally sound (requiring little energy), scalable, and strong on security.

Cryptocurrency: Both a New and an Older Space

The excellent book by Finn Brunton, Digital Cash (2019), and other articles and books addressing the cypherpunks—the cryptographic activists focused on privacy who sought to retake control through programming and systems—overwhelmingly feature male actors. In addition to Diffie and Hellman, appropriately revered for inventing public key (in the open community), most of the high-profile cypherpunks are male—Timothy May, Eric Hughes, John Gilmore, etc.

Yet it was one of the co-founders, Judith Milhon, known as “St. Jude,” who coined the term cypherpunk. The cypherpunks, whom journalist Steven Levy referred to as the “Code Rebels” in his book Crypto, were inspired in part by the work of Diffie and Hellman. The response of the National Security Agency (NSA) was to try to prevent private communications it could not surveil, and to thwart or restrict the development and proliferation of crypto it could not easily break. This included its work with IBM to keep the key length at a lower threshold for the Data Encryption Standard, or DES. This made DES subject to the “brute force” of NSA’s unparalleled computing power. Further, it is widely believed that NSA also worked to have a back door in IBM’s DES—code containing a concealed and secret way into the crypto system—to enable surveillance of the public.
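A back-of-the-envelope sense of why key length mattered for “brute force” (the search rate below is an illustrative assumption, not a claim about any agency’s actual hardware): DES’s effective key is 56 bits, so an exhaustive search tries at most $2^{56}$ keys.

$$2^{56} \approx 7.2 \times 10^{16} \ \text{keys}, \qquad \frac{7.2 \times 10^{16}\ \text{keys}}{10^{9}\ \text{keys/second}} \approx 7.2 \times 10^{7}\ \text{s} \approx 2.3\ \text{years on one such machine},$$

and because the search parallelizes trivially across many machines, a shorter key favored whoever commanded the most computing power.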

St. Jude: A Creative Force Among Early Cypherpunks

Born in Washington, DC in 1939, St. Jude was a self-taught programmer, hacker, activist, and writer. As a young adult she lived in Cleveland and was a part of its Beat scene. She volunteered in organizing efforts and took part in the Selma to Montgomery Civil Rights March in 1965, for which she was arrested and jailed. Her mug shot is a commonly published photo of her, symbolic of her commitment to civil rights throughout her life. She moved from the East Coast to San Francisco in 1968, embracing the counterculture movement of the Bay Area. In the late 1960s she was a programmer for the Berkeley Computer Company, an outgrowth of the University of California, Berkeley’s famed time-sharing Project Genie.

Judith "St. Jude" Milhon, 1965, Montgomery, Alabama Police Department. Mug Shot.

Active in Computer Professionals for Social Responsibility (CBI has the records of this important group), she was an influential voice in the organization. She was also one of the leaders of Project One’s Resource One, the first public computer network bulletin board in the US, which operated in the San Francisco area. She was known for her strong belief that network computer access should be a right, not a privilege. She was an advocate for women technologists and acutely aware of the relative lack of women “hackers” (at the time the term meant a skilled programmer, not necessarily its later meaning associated with malicious hacks).

St. Jude was a widely recognized feminist in computing and activism circles. She was among the founders of the “code rebels,” and her giving the group the name that stuck, cypherpunks, suggests she had a voice in this male space (her writing and interviews strongly suggest this as well), but this was not necessarily, and probably not, indicative of a general acceptance of women in the group. Some of St. Jude’s views were at odds with academic feminism and gender studies but may have fit more closely with the cypherpunks’ ethos. She abhorred the political correctness she saw in academic communities and among educational and political elites. She believed technology would fix many problems, including the social problems of gender bias and discrimination. “Girls need modems” was her answer, oft-repeated motto, and rallying statement. It was what she felt was needed to level the playing field (Cross, 1995).

The lack of women among the cypherpunks, and St. Jude’s great frustration that more women did not adopt her active hacker approach and ethic, likely suggest a dominant and biased male culture that opened only to someone with the exceptional talent, creativity, and interactive style she possessed.

St. Jude became a senior editor and writer at Mondo 2000, a predecessor publication that Wired drew from stylistically in writing about information technology. She also was lead author (with co-authors R.U. Sirius and Bart Nagel, Random House, 1995) of The Cyberpunk Handbook: The Real Cyberpunk Fakebook (a subtitle a bit prophetic, without intending the terminology, given the later-formed Facebook and its profiteering off fake news), and along with her journalism she wrote science fiction. Judith “St. Jude” Milhon passed away from cancer in 2003.

Cyberpunk magazine

There definitely is a need for more historical research on gender and the cypherpunks, as well as on the sociology of gender in recent cryptocurrency projects, related intermediaries, and investors and users. Rudimentary contours nonetheless can be gently and lightly sketched from what is known. Names from the cypherpunks mailing list appearing in articles and the handful of books addressing the topic are about 90 percent male. At the start St. Jude was the sole woman in this core group. If limited to those directly interested in and investigating possibilities with digital currencies before the advent of Satoshi Nakamoto’s Bitcoin in 2008, it was even more male-dominated.

As such, women role models were very few in early public key efforts and, more broadly, among the code rebels or cypherpunks overall. There are deep connections between the cypherpunks and Bitcoin, but also other early coins as well. The young crypto entrepreneurs and activists of recent years and of today of course were never a part of the group, but they nonetheless often grew an interest in it. They were motivated by its past activity and had reverence for Tim May, Eric Hughes, John Gilmore, and others. This perhaps led to fewer opportunities perceived to be, or actually, open to women, and likely to less recognition and consideration among women of pursuing this space.

Of the two exceptions—women in the upper echelons of cryptocurrency—one came from an equally talented and active wife-and-husband team (the Breitmans). The other is a truly exceptional individual, possibly deserving the term genius, who like Vitalik Buterin (Ethereum’s lead founder) achieved amazing things at a young age, was exposed to the potential need for crypto, and was driven by the goal of socially impactful career success.

Tezos Co-Founder Kathleen Breitman  

12 November 2021; Kathleen Breitman, Tezos, on Centre Stage during day one of Web Summit 2021 at the Altice Arena in Lisbon, Portugal. Photo by Harry Murphy/Web Summit via Sportsfile.

There are more than 14,000 altcoins; the top 30 are currently capitalized at $4 billion or more (the value of circulating coins), and those not in the top 200 generally are less than $40 million in capitalization and in a precarious spot if they do not rise at least five-fold in the coming years. Many in the investment community have pejoratively labeled lesser-capitalized altcoins (and, for some Bitcoin enthusiasts, all altcoins) as “sh*t coins.” The cryptocurrency industry has given rise to a growing cohort of specialized trade and investment journalists, following Ethereum founder Vitalik Buterin’s initial pre-Ethereum pursuit of coin journalism in creating Bitcoin Magazine. These include journalists, analysts, and evangelists (often all wrapped into one) at e-magazines such as The Daily HODL and Cointelegraph, two of the more respected larger publications among many others. They write mainly on the top 50 coins, what most in the investment community care about, and thus are writing very heavily about men—a reinforcing mechanism hindering perceived and real opportunities for women.

In the top 30 coins, only two have a woman founder or principal co-founder, none has a sole woman founder or an all-woman leadership team, and many are all male at the top. A few coins have longer founders’ lists, in the upper single digits. The two principal women co-founders of major altcoins are Kathleen Breitman of Tezos and Joyce Kim of Stellar Lumens. Tezos is at $4 billion in capitalization and ranks 28th among altcoins; Stellar Lumens is at $4.8 billion and ranks 22nd.

The “Proof-of-Stake”-modeled coin project Tezos was co-founded by Kathleen Breitman and her husband Arthur Breitman, launching in 2018 along with a Tezos foundation created by Johann Gevers. Kathleen Breitman studied at Cornell University before joining a hedge fund and working as a consultant; Arthur Breitman is a computer scientist who worked in quantitative finance prior to Tezos. A dispute with the foundation and Gevers led the Breitmans into a lawsuit that delayed the launch and hurt the project; ultimately a payout settled the matter. Kathleen Breitman has stated that she has been underestimated in the industry, as some assume her husband is the real creator, when in fact they very much co-created Tezos, technically and organizationally.

Stellar Lumen’s Co-Founder Joyce Kim

To say Joyce Kim’s career is impressive is an understatement; stellar is, in fact, quite fitting. Kim, a second-generation Korean American, grew up in New York City, attended the High School for the Humanities, and graduated from Cornell University at age 19. She followed this with graduate school at Harvard University and law school at Columbia University. She became a corporate M&A attorney while also doing pro bono work for Sanctuary for Families and the Innocence Project. Back in high school she had witnessed the trouble and expense lower-income people around the world face in sending money to family, something likely also evident in her work at Sanctuary for Families.

After success co-founding Simplehoney, a mobile ecommerce firm, and founding and serving as CEO of a Korean entertainment site, she became one of the rare (percentage-wise) women in venture capital, working at Freestyle Capital. Focusing on the power of social capital, she partnered in 2014 with Ripple co-founder Jed McCaleb to found Stellar Lumens, an open-source blockchain-based coin, network, and platform project and an effort of the nonprofit Stellar Development Foundation.

Kim’s motivation and vision for Stellar were driven by the fact that 35 percent of adult women globally (World Bank statistics) do not have a bank account despite many of them saving regularly. As such, they have trouble protecting, sending, and receiving funds, and difficulties paying bills and helping family. Stellar, as a platform and network, allows people to send funds in small sums, at low cost, as easily as sending a DM or email. With 6.3 billion people in the world holding smartphones, and perhaps as many as 20 percent of them without a bank account, Stellar Lumens addresses a critical problem and serves a great societal need. The coin Celo also works in this very important area, making a positive difference in the world. Stellar Lumens (and Celo) change lives and empower lower-income people, especially women, who are less likely than men to have bank accounts due to discrimination and fewer resources. As Kim told Fast Company in an interview shortly after the founding, with Stellar she “found her true north” (Dishman, 2015). In addition to Stellar Lumens, Kim recently served as a Director Fellow at the famed MIT Media Lab.

Beyond the prestigious MIT fellowship, Kim has moved on from her role as Executive Director of Stellar and the day-to-day of the coin, and she is now having an impact, socially and financially, in crypto venture capital, an arena that could benefit from more women. Kim is Managing Partner at SparkChain Capital.

Mining Deeper: Guapcoin’s Tavonia Evans and the African Diaspora Community

At coins outside the top 30, 50, or 100 in capitalization, project teams work and hope that their technology and mission will one day carry them to much higher levels. There are people and projects behind the coins, something that is sometimes disrespectfully forgotten when investors or others refer to coins and projects in derogatory terms.

I wanted to research a coin in the middle third of the 14,000 or so coins out there by current capitalization and was deeply moved to learn about Guapcoin and its tremendous mission. It was founded in 2017 by African American data scientist Tavonia Evans. Evans, a mother of eight, had earlier founded a peer-to-peer platform company but was unsuccessful at getting venture funding. Venture capital is not a level playing field, and far less than one percent of venture funding goes to businesses led by African American women. At this intersection of race and gender, societal bias in finance is particularly pronounced.

The inability to get funding for that business led her to move on and inspired her Guapcoin project, a cryptocurrency focused on addressing “the economic and financial concerns of the Global African Diaspora community.” Evans’s vision for Guapcoin goes beyond merely being a means of exchange for the Global African Diaspora community, and for “Buying Black”; it is also a property protection mechanism that combats gentrification and documents all forms of property ownership (from real estate, to copyright, to music licenses) so that “the Black and Brown community will have its wealth protected by a transparent, immutable blockchain.”

In 2019, Evans and Guapcoin founded the Guap Foundation to permanently ensure the mission of the coin project is carried out. Many altcoins have associated foundations to both further and protect the integrity of the mission for generations to come (guapcoin.org).

It is through amazing, socially oriented, and green projects like Guapcoin, Stellar Lumens, and Celo that I realized my initially negative perspective on cryptocurrency several years back, rooted in my very critical views of Bitcoin’s environmental impact, was sorely misguided for many altcoins launched in 2016 and later, and for Ethereum (2015), which is converting to a Proof-of-Stake consensus model to become green.

“Meetups” and Standout Early Scholarship on Gender and Cryptocurrency

Blockchain and the web

There are a mere handful of published scholarly studies to date examining gender and cryptocurrency. One stood out to me as especially compelling in its creative methodology, insights, and importance. Simon Fraser University’s Philippa R. Adams, Julie Frizzo-Barker, Betty B. Ackah, and Peter A. Chow-White designed a project in which they engaged in participant observation and interaction at over a half dozen “Meetup” events that were primarily, or at least in part, marketed to women, often to educate, encourage, or address gender disparity in cryptocurrency. All of these were in the Vancouver, British Columbia, metropolitan area.

Adams and her co-authors do a wonderful job of interpreting, analyzing, and eloquently conveying the meaning of these events. Some meetups were well designed and executed to support and empower women in this new industry and space. Others were far less effective, succumbing to the challenges of "trying to support adoption of a new technology," or ended up presenting more resistance than support. I urge you to read this excellent work of scholarship (Adams, et al.); the chapter is in the recommended-readings volume edited by Massimo Ragnedda and Giuseppe Destefanis (2019), an excellent book overall and one of the first quality social science books on emerging Web 3.0.

Educational and Empowerment Organizations and Looking Forward

In addition to meetup events that are local in origin, a growing number of nationwide education and advocacy organizations by and for women in cryptocurrency have emerged. Some foster local meetup events; others run other supportive programs.

In Brooklyn, New York, Maggie Love founded SheFi.org, seeing blockchain as a powerful tool for more inclusive and equitable financial tools and systems. It engages in education to advance understanding and opportunities for women in blockchain and decentralized finance.

Global Women in Blockchain Foundation is an umbrella international organization without a central physical headquarters (in the spirit of the technology and decentralization). It is designed to accelerate women’s leadership roles in blockchain education and technology. The websites for these two organizations can be found on this site in the list of organizations.

Efforts to reduce the tremendous gender gap in cryptocurrency development projects, and especially in founder roles and leadership posts, are extremely important, both ethically and for the creativity, success, and potential of this field. Further, blockchain, and its applications in crypto, are at the heart of Web 3.0, the future of digital technology. If the field remains 90 percent male, it will hurt the field of IT greatly by further reducing women's overall participation in IT, given blockchain's growing share of our digital world.

There is a large gender gap not only in computer science but also in finance, hedge funds, and venture capital, all of which accentuate imbalances in power and opportunity in favor of men in crypto. The VC gender gap is especially problematic, as it reinforces hurdles for women and BIPOC, independently and especially at these important intersections, for both small companies and cryptocurrency projects.

Joyce Kim's leadership at SparkChain, funding crypto ventures, is so refreshing. The firm's staff is highly diverse in terms of both gender and race and ethnicity. More women in VC leadership, and at VCs with a crypto focus, is incredibly important. It is also critical that education in both high school and college does not, directly or indirectly and inadvertently, create gendered spaces favoring men, or spaces inhospitable to women.

The excellent study by the team at Simon Fraser University looking at cryptocurrencies, and other studies looking at finance and hedge funds, have identified jargon and terminological barriers to entry. In crypto the barriers are many, from outright gender bias, to clubhouses, to other restrictive spaces, but terminology and cultures of exclusion are especially powerful in hindering inclusion, both intentionally and unintentionally.

One motivation for this blog and site, and especially the site's inclusion of a historical glossary of terms (continually added to) and a Cryptocurrency Historical Timeline, is to contribute in a small way to education and to the first steps in removing barriers to inclusion rooted in terminology and the cultural elements important to communication in this area. Anyone interested in this area and devoting time to it will soon move far beyond these resources, but they might help understanding a bit initially; at least that is a goal. I also see these as tools that can greatly benefit from the community.

I am continually learning from readings, correspondence, and meetings with others in this space. I have already added to the readings from useful comments and suggestions people sent me after my first post last week. I hope these sources accelerate as community-used and community-influenced tools, and thus I very much encourage and welcome feedback. I will take the timeline and glossary through additions and tweaks, and thus many editions or iterations, but for now they get at some of the technical and cultural terminology and basics. (Why does the mantra of HODL, Hold On for Dear Life, keep coming up as crypto coins currently plummet? The glossary provides historical context.)

[Republished with only slight adjustment from Blockchain and Society: Political Economy of Crypto (A Blog), January 25, 2022. http://blockchainandsociety.com]

[Please consider subscribing to the free blog at the URL above]


Bibliography

Abbate, Janet (2012). Recoding Gender: Women’s Changing Participation in Computing, MIT Press.

Adams, Philippa R., Julie Frizzo-Barker, Betty B. Ackah, and Peter A. Chow-White (2019). In Ragnedda, Massimo and Giuseppe Destefanis, eds. Blockchain and Web 3.0: Social, Economic, and Technological Challenges, Routledge.

Brunton, Finn. (2021). Digital Cash: The Unknown History of the Anarchists, Utopians, and Technologists Who Created Cryptocurrency, NYU Press.

Celo Website. www.celo.org

Cross, Rosie (1995). “Modern Grrrl.” Interview with Judith “St. Jude” Milhon. Wired, February 1. www.wired.com/1995/02/st.-jude/

Dishman, Lydia. (2015). “The Woman Changing How Money Moves Around The World.” Fast Company February 6.

Hao, Karen. (2018). “Women in Crypto Are Reluctant to Admit There Are Very Few Women in Crypto.” Quartz (qz.com). May 5, 2018. https://www.qz.com

Hicks, Marie (2017). Programmed Inequality: How Britain Discarded Women Technologists and Lost Its Edge in Computing, MIT Press.

Guapcoin Website. www.guapcoin.org

Gross, Elana Lyn. (2019). “How to Close the Venture Capital Gender Gap Faster.” Forbes, May 20.

Klemens, Sam. (2021). “10 Most Influential People in Crypto: Kathleen Breitman.” Exodus. August 3.

Misa, Thomas J., Ed. (2010). Gender Codes: Why Women are Leaving Computing, Wiley-IEEE.   

Misa, Thomas J. (2021). “Dynamics of Gender Bias in Computing.” Communications of the ACM 64: 6, 76-83.

St. Jude, R.U. Sirius, Bart Nagel (1995). The Cyberpunk Handbook, Random House.

Yost, Jeffrey R. (2015). “The Origin and Early History of the Computer Security Software Industry.” IEEE Annals of the History of Computing, 32:7, April-June, 46-58.

Yost, Jeffrey R. (2016). “The March of IDES: The Advent and Early History of the Intrusion Detection Expert Systems.” IEEE Annals of the History of Computing, 38:4, October-December, 42-54.

Yost, Jeffrey R. (2017). Making IT Work: A History of the Computer Services Industry, MIT Press.


Jeffrey R. Yost (January 2022). “Few Women on the Block: Legacy Codes and Gendered Coins,” Interfaces: Essays and Reviews on Computing and Culture Vol. 3, Charles Babbage Institute, University of Minnesota, 1-18.


About the Author: Jeffrey R. Yost is CBI Director and HSTM Research Professor. He is Co-Editor of the Studies in Computing and Culture book series with Johns Hopkins University Press and PI of the new CBI NSF grant Mining a Useful Past: Perspectives, Paradoxes and Possibilities in Security and Privacy. He has published six books and dozens of articles, has led or co-led ten sponsored projects (for NSF, Sloan, DOE, ACM, IBM, etc.), and has conducted hundreds of oral histories. He serves on committees for NAE and ACM, and on two journal editorial boards.

 

 

Before the Byte, There Was the Word: The Computer Word and Its Many Histories

Johannah Rodgers

Abstract: Tracing and documenting the genealogies of what, in the twentieth century, will become known as “the computer word,” this article explores the importance of the term to the histories and presents of digital computings, the technical and rhetorical functions of verbal language involved with its emergence in the mid-twentieth century U.S., and the import of the term’s currency in discourse networks forged across industry, government-sponsored university research initiatives, and popular media.

PDF version available for download.

Illustration 1: IEEE glossary definition of the computer word.

What We Know

Unlike the terms bits and bytes, the computer word, which is defined by the IEEE as "a unit of storage, typically a set of bits, that is suitable for processing by a given computer" (Illustration 1), has not yet become part of popular discourse. Instead the term remains a technical one, familiar to every computer scientist and technician but not to the average consumer. Also unlike the terms bits and bytes, the origins of which have become part of the print record (bits is said to date from a January 9, 1947 Bell Labs memo drafted by John W. Tukey and byte from a June 11, 1956 IBM memo drafted by Werner Buchholz) (Tropp), those surrounding the computer word have not been well documented in either the histories of computings or fields related to it, including writing and media studies. Delving into the histories of computings archive, it is possible to identify a narrow time frame in which the term begins its emergence, sometime between late spring 1945, when John von Neumann drafts his notes that will later be referred to as the "First Draft of the EDVAC Report," and September 1945, when J. Presper Eckert, John Mauchly, et al., compile their report entitled "Automatic High Speed Computing: A Progress Report on the EDVAC." Yet, the story of the "computer word" is, like many in the histories of computings, neither a classic origin story nor one with a sole author/inventor or single conclusion. Rather, it is collaboratively authored, recursive in its structure, and has implications that, I believe, are only beginning to be fully explored.

Every electronic computational machine since the ENIAC, the first fully electronic computing project in the U.S., has been described as having a "word size" and as containing a certain number of “words.” Acting as an interface between hardware and what will later become known as software, the computer word becomes one of the building blocks for machine and programming languages. It is one part of the process that enables hardware and the instructions controlling it to communicate and "understand" one another. Technically, choosing a computer word's "word size" is one of the earliest steps in chip design and, metaphorically, the computer word can be said to function as a word does in "telementational models" (Harris) of human to human communication: it allows for information to be transmitted and exchanged according to a "standard" meaning. While, in human communication, verbal words, i.e., those spoken or inscribed by humans, rarely maintain a fixed meaning, in machine communication, the computer word and the bytes and bits that will later be said to compose it have been, over time, made to adhere to such standards. Figuratively, if bits can be said to be the millimeters of digital electronic computers and bytes the centimeters, the computer word can be said to function as the meter.
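
To make the bit/byte/word hierarchy concrete for readers who have not met it before, the short sketch below packs four 8-bit bytes into one 32-bit word and unpacks them again. It is a minimal illustration in Python, a modern language chosen only for legibility; the 32-bit width and the helper names are assumptions of the example, since historical machines used many different word sizes.

```python
# Illustrative sketch only: a hypothetical 32-bit "word" built from four 8-bit bytes.
# Word sizes vary by machine; 32 bits is assumed here purely for demonstration.

WORD_SIZE_BITS = 32
BYTE_SIZE_BITS = 8
BYTES_PER_WORD = WORD_SIZE_BITS // BYTE_SIZE_BITS  # 4 bytes per 32-bit word

def pack_word(byte_values):
    """Combine BYTES_PER_WORD byte values (0-255) into a single integer word."""
    assert len(byte_values) == BYTES_PER_WORD
    word = 0
    for b in byte_values:
        assert 0 <= b < 2 ** BYTE_SIZE_BITS
        word = (word << BYTE_SIZE_BITS) | b  # shift left one byte, then append the next
    return word

def unpack_word(word):
    """Split an integer word back into its constituent bytes (most significant first)."""
    return [(word >> (BYTE_SIZE_BITS * i)) & 0xFF
            for i in reversed(range(BYTES_PER_WORD))]

if __name__ == "__main__":
    w = pack_word([0x12, 0x34, 0x56, 0x78])
    print(f"word = {w:#010x} ({WORD_SIZE_BITS} bits)")   # word = 0x12345678 (32 bits)
    print("bytes =", [hex(b) for b in unpack_word(w)])   # ['0x12', '0x34', '0x56', '0x78']
```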

Illustration 2: Two women operating the ENIAC's main control panel while the machine was still located at the Moore School. "U.S. Army Photo" from the archives of the ARL Technical Library. Left: Betty Jennings (Mrs. Bartik); Right: Frances Bilas (Mrs. Spence).

Despite the technical significance of the computer word to the historic and current functions of digital electronic computers, documenting its histories is, for several reasons, anything but straightforward, in part because of the complexities of the EDVAC project itself, in part because of the later depictions of the EDVAC project in relation to projects predating it, and in part because of issues related to the histories of computings archives. Unlike the ENIAC, the EDVAC project unfolded during a time of transition from war-time to post-war based funding priorities for the U.S. military and from university-focused to industry-focused research and development initiatives. As a result, it is one that continues to provide scholars with a wealth of material and issues (technical, economic, political, and socio-cultural) to consider in relation to other electronic and electro-mechanical computing projects in the United States. The EDVAC project was, as Michael Williams has clearly documented, a fraught one and produced a machine that may actually have only been operational for a very short time and differed considerably from initial design documents. Further, recent research related to the ENIAC project by Haigh, Priestley and others, has emphasized the similarities rather than the differences between the ENIAC and EDVAC projects and called into question the portrayal of the "Von Neumann" architecture as the invention of von Neumann or a clear departure from the architectures of earlier "computing" projects in the U.S.

The availability, accessibility, and reliability of documentary archival materials also all play roles in how the (hi)stories of the computer word can be told. To point to just two examples related to my research for this project, the digital copy of Eckert and Mauchly's "Automatic High Speed Computing" report available in the archive of the Museum of the History of Computing is an excerpt of the complete report. While this particular copy is valuable to researchers since it is from the archive of Donald Knuth and contains his notes, at present no complete digital copy of the report exists that is publicly accessible. It was, in fact, only through the very generous support and assistance of the University of Pennsylvania Libraries Special Collections that I was able to remotely access a digitized copy. The existence of as yet uncatalogued materials raises other issues unique to the histories of computings archives. As a result of his ongoing research involving the Goldstine papers at the American Philosophical Society archive, Mark Priestley has drawn attention to manuscripts and unpublished lecture notes with significant implications not only for how specific terms, including the computer word, are interpreted but also for other topics, such as how Turing's work may have been used by von Neumann.

Illustration 3: EDVAC Report footnote.

While the paper trail documenting the term "computer word" will be unfolding for some time still, what we do have currently are paper traces documenting its evolution from a term with several different functions as a rhetorical device to a technical term and finally to a technical standard. The September 1945 "Progress Report on the EDVAC" appears to be the first time that the term “word” is used in an official document and proposal. Attributed to von Neumann in a footnote (Illustration 3), the term "word" (without quotation marks!) is introduced in a manner with more than slight Biblical overtones: "each pulse pattern representing a number or an order will be called a word*" (Eckert, et al.). In the earlier, June 1945 draft EDVAC report, von Neumann refers to the unit that will later be called a “word” as a “code word," a term that references the operations of telegraphic machines (Priestley) and also, likely, the “codes” contained on the punched cards used to feed program instructions to early automatic calculators, including the ENIAC and the Mark I. Although there are references to a/the "word," "words," and specific types of "words," i.e., logical words, machine words, instruction words, control words, in documents throughout the 1950s, the earliest mention of the term "computer word" in all likelihood appears later, around 1960 (Stibitz, COBOL Report).

What We Are Still Learning

These findings reveal some useful insights both into what we know and into what we are still in the process of learning about the histories of computings and the roles of verbal language, linguistics, and language education in them.

As Nofre, et al., emphasize in their 2014 article "When Technology Became Language," the mid-1940s represent an important inflection point in how electronic digital computers are being conceived and discussed as "understanding" and engaging with language. The introduction of the term "word" to describe the operations of the EDVAC architecture appears to be one part of the discursive and technical transformation of high speed automatic calculators into general purpose digital electronic computers and into language processing (if not yet language possessing) machines. One of the key differences between the ENIAC and the EDVAC was the addition of new types of electronic storage media and their use not only for storing but for manipulating codes “internally” to instruct the machine (Burks). One part of the external memory in the ENIAC, as Eckert explains it in his first Moore lecture, was “the human brain” (116). The ENIAC was a decimal-based calculator and required significant input from skilled human operators in order to function. Both issues are noteworthy because part of the story of this metaphor of the word has to do with its emergence at the same time as a move away from human-readable toward machine-readable writing systems (decimal to binary) and communication and storage media (wires to pulses; cards, paper, and human brains to short and long tanks and mercury delay lines). In a Turing complete machine, the machine must have some way of representing its own operations; both the ENIAC and the EDVAC had this capability. However, one major difference between these two machines was the manner in which instructions were represented and communicated so the machine could “understand” them. With the ENIAC, wires were used as the system of notation; with the EDVAC, the alphabet became another system of notation (Alt).
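
The decimal-to-binary shift described above, and the idea that a single word can carry either a number or an "order," can be illustrated with a deliberately modern, hypothetical sketch. The 16-bit width and the 4-bit operation code / 12-bit address layout below are invented for the example and do not reproduce the EDVAC's actual code; the point is only to show fixed-width binary words standing in for both data and instructions.

```python
# Hypothetical illustration: one fixed-width binary word holding either a number or an "order."
# The 16-bit width and the 4-bit opcode / 12-bit address layout are invented for this sketch.

WORD_BITS = 16
OPCODE_BITS, ADDRESS_BITS = 4, 12

def number_word(decimal_value):
    """Encode a decimal integer as a fixed-width binary word (string of 0s and 1s)."""
    assert 0 <= decimal_value < 2 ** WORD_BITS
    return format(decimal_value, f"0{WORD_BITS}b")

def order_word(opcode, address):
    """Encode an instruction ("order") as opcode and address fields packed into one word."""
    assert 0 <= opcode < 2 ** OPCODE_BITS and 0 <= address < 2 ** ADDRESS_BITS
    return format((opcode << ADDRESS_BITS) | address, f"0{WORD_BITS}b")

print(number_word(1945))       # 0000011110011001 -> the decimal number 1945 as a binary word
print(order_word(0b0011, 42))  # 0011000000101010 -> a made-up "operation 3 on storage cell 42"
```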

These paper traces from the 1940s also underscore the collaborative environment in which military funded research projects were being developed and documented in the U.S. While I am not suggesting with this term "collaborative" that in such an environment everyone was acting communally or even getting along while they worked together, I am arguing that fluidity and responsive improvisation are evident both in the discourse and the technical systems being described. What this means to documenting the genealogy of the computer word is that if the neologism is to be attributed to von Neumann, it would be necessary to put quotation marks around the terms "computer word" and "von Neumann" to indicate that both are names applied retroactively to fix the meanings of phenomena that were still emergent when placed in their specific historical contexts. Neither the "computer word" nor what is now still frequently referred to as the "Von Neumann Architecture" emerges fully formed as a concept or technical standard in 1945 when von Neumann drafted his notes for the EDVAC project in Los Alamos, New Mexico, referred to the unit that will be used to communicate data and instructions in the EDVAC project as a "code word," and handed his notes to one or more secretaries to type up and possibly reproduce and circulate (Williams).

The absence of a single author or origin story for terms like the computer word reinforces the importance of analyzing the rhetorical contexts in which the word choices of von Neumann/"von Neumann" and others are made, as well as the processes of exchange and circulation of these terms (Nofre, Martin). Paul Ceruzzi's recent article in this journal about the myths of immateriality surrounding the natural-resource-intensive reality of "cloud"-based computing is one example of the power of names to shape discourse and its receptions (Ceruzzi). Another example relates to the complexities involved in attempts by Tropp to depict the emergence of the term "bit" as a story with a single author. As the multilayered documentation presented in Tropp's article makes clear, there were many contributors to the creation of the neologism bit, which became metaphorically and technically imbricated in human discourse and in the engineering projects that discourse informs and describes. The term "bit," which was originally named an "alternative" by Shannon, was at one point deemed a "bigit," and, while Tukey's memo may have been the first document we currently have access to in which the term appears, even a cursory reading of it reveals that drawing from it the conclusion that Tukey "coined" the term is far from certain (Tropp).

Illustration 4: EDVAC Report ENIAC-EDVAC comparison.

What's In a Word?: Teaching Machines to Read and Write

After 1946, the "signifying operations" (Rodgers) of terms related to and involving language as applied to digital electronic computers continue to widen in the technical, industrial, and popular literature (Nofre, Martin). Part of my interest in this relates to the fact that in what we now call the early history of digital electronic computing, calculating machines are constructed based on models of the human, which are then explained via metaphors to influence decisions about the funding of educational and work initiatives for human computers and electronic computers, based on the costs and interchangeability of the two (Grier). As Grace Hopper will point out in her 1952 article "The Education of a Computer," the EDVAC architecture is identical to a schematized cognitive model of a writing/calculating subject. In this context, the date and location of the drafting of the EDVAC notes and later report are both significant considering their purpose and later reception, as are the later technical decisions that placed the affordances and performance standards of machinic operability over those of human legibility in the EDVAC project (Illustration 4). Yet, it is in part through verbal logic and the deployment of rhetoric that decisions were made regarding whose and what logics and languages would become "hard wired" into digital electronic computing machines.

Illustration 5: Depiction of Human Brain as Instrument for Creation, Storage, and Exchange of Word Images (Starr, 1895).

Somewhat ironically, though perhaps inevitably, it is with issues of representation and the roles of spoken and inscribed language in depicting and constructing realities and histories that the intrigue involved with interpreting relationships between alphabetic words and computer words really begins. While it is impossible to know the exact reasons for a specific word choice, it is possible to consider the rhetorical contexts in which the EDVAC report was written. The term "word" functioned both to signal what was new about the project and to perform some explanatory work for an audience deciding the fate of the EDVAC funding proposals. Yet the target domain of this metaphor is human language processing, which it is presumed, rather than proved, the proposed technical system will replicate (Harris). In giving the EDVAC calculating machine the ability to "instruct" itself with the metaphor of the word, binary arithmetic calculation is paired with alphabetic communication in a way that has implications for the processes involved with both, rests on assumptions about how language functions and what the purposes of communication are, and works to the benefit of specific parties and interests (Dick). From a writing studies perspective, the word choices of "code word" and "word" connect the mid-1940s with the instrumentalization of writing and human writing subjects that had been occurring throughout the late nineteenth and early twentieth centuries (Gitelman, Rodgers) and to early discussions of "AI" and the roles and histories of writing, logic, and language education policies embedded in them (Kay, De Mol, Heyck).

Acknowledgments

I am grateful to Joseph Tabbi, Cara Murray, and Robert Landon for their comments and suggestions related to earlier drafts of this article. Thank you also to Jeffrey Yost for his insightful suggestions, and to Amanda Wick and Melissa Dargay for their work and contributions. Thanks as well to Holly Mengel and David Azzolina for their assistance in remotely accessing a digital copy of "Automatic High Speed Computing: A Progress Report on the EDVAC," and to Donald Breckenridge for his practical, editorial, and emotional support. Finally, a special thanks to John H. Pollack, Curator, Kislak Center for Special Collections, Rare Books and Manuscripts at the University of Pennsylvania Libraries, and his colleagues Charles Cobine and Eric Dillalogue.

 


Bibliography

Alt, Franz. (July 1972). "Archaeology of Computers: Reminiscences, 1945-1947." Communications of the ACM, vol. 15, no. 7, pp. 693–694. https://doi.org/10.1145/361454.361528.

Burks, Arthur W. (1978). "From ENIAC to the Stored-Program Computer: Two Revolutions in Computers." Logic of Computers Group, Technical Report No. 210. https://deepblue.lib.umich.edu/handle/2027.42/3961.

Burks, Arthur W., Herman H. Goldstine, and John von Neumann. (28 June 1946). Preliminary Discussion of the Logical Design of an Electronic Computer Instrument. Institute for Advanced Study. https://library.ias.edu/files/Prelim_Disc_Logical_Design.pdf

Campbell-Kelly, M. and Williams, M. R., eds. (1985). The Moore School Lectures: Theory and Techniques for Design of Electronic Digital Computers, volume 9 of Charles Babbage Institute Reprint Series for the History of Computing. MIT P. https://archive.org/details/mooreschoollectu0000unse.

Ceruzzi, Paul E. (2021). "The Cloud, the Civil War, and the 'War on Coal.'" Interfaces: Essays and Reviews in Computing and Culture, Charles Babbage Institute, University of Minnesota. https://cse.umn.edu/cbi/interfaces.

De Mol, Liesbeth and Giuseppe Primiero. (2015). "When Logic Meets Engineering: Introduction to Logical Issues in the History and Philosophy of Computer Science." History and Philosophy of Logic, vol. 36, no. 3, pp. 195-204. https://doi.org/10.1080/01445340.2015.1084183.

Dick, Stephanie. (April–June 2013). "Machines Who Write." IEEE Annals of the History of Computing, Vol. 35, No. 2, pp. 88-87. https://doi.org/10.1109/MAHC.2013.21.

Eckert, J. P. (1985). "A Preview of a Digital Computing Machine." The Moore School Lectures: Theory and Techniques for Design of Electronic Digital Computers, volume 9 of Charles Babbage Institute Reprint Series for the History of Computing, edited by M. Campbell-Kelly and M.R. Williams, MIT P, pp. 109-128. https://archive.org/details/mooreschoollectu0000unse.

Eckert, J. P. and Mauchly, J. W. (September 30, 1945). Automatic High Speed Computing: A Progress Report on the EDVAC, Moore School of Electrical Engineering, University of Pennsylvania.

Gitelman, Lisa. (1999). Scripts, Grooves, and Writing Machines: Representing Technology in the Edison Era. Stanford UP.

Heyck, Hunter. (2014). “The Organizational Revolution and the Human Sciences.” Isis, vol. 105, no. 1, pp. 1–31. https://www.journals.uchicago.edu/doi/10.1086/675549.

Harris, Roy. (1987). The Language Machine. Duckworth.

Haigh, Thomas, Mark Priestley, and Crispin Rope. (January-March 2014). "Reconsidering the Stored-Program Concept." IEEE Annals of the History of Computing, vol. 36, no. 1, pp. 40-75. https://doi.org/10.1109/MAHC.2013.56.

Haigh, Thomas and Mark Priestley. (January 2020). "Von Neumann Thought Turing's Universal Machine was 'Simple and Neat.': But That Didn't Tell Him How to Design a Computer." Communications of the ACM, vol. 63, no. 1, pp. 26-32. https://doi.org/10.1145/3372920.

Hopper, Grace M. “The Education of a Computer.” (1952). Proceedings of the 1952 ACM National Meeting (Pittsburgh), edited by ACM and C. V. L. Smith. ACM Press, pp. 243–49.

IEEE Standards Board. (1995). IEEE Standard Glossary of Computer Hardware Terminology. IEEE. doi: 10.1109/IEEESTD.1995.79522.

Kay, Lily. (2001). "From Logical Neurons to Poetic Embodiments of Mind: Warren S. McCulloch's Project in Neuroscience." Science in Context vol. 14, no. 4, pp. 591-614.

Martin, C. Dianne. (April 1994). "The Myth of the Awesome Thinking Machine." Communications of the ACM, vol. 36, no.4, pp: 120-33.

Nofre, David, Mark Priestley, and Gerard Alberts. (2014). "When Technology Became Language: The Origins of the Linguistic Conception of Computer Programming, 1950–1960." Technology and Culture, vol. 55, no. 1, pp. 40-75.

Priestley, Mark. (2018). Routines of Substitution: John von Neumann’s Work on Software Development, 1945–1948. Springer.

Report to Conference on Data Systems and Languages Including Initial Specifications for a Common Business Oriented Language (COBOL) for Programming Electronic Digital Computers. (April 1960). Department of Defense.

Rodgers, Johannah. (July 2020). "Before the Byte, There Was the Word: Exploring the Provenance and Import of the 'Computer Word' for Humans, for Digital Computers, and for Their Relations." (Un)-Continuity: Electronic Literature Organization Conference, 16-18 July 2020, University of Central Florida, Orlando, FL, USA. https://stars.library.ucf.edu/elo2020/asynchronous/talks/11/.

Starr, M. Allen. (1895). "Focal Diseases of the Brain." A Textbook of Nervous Diseases by American Authors edited by Francis X. Dercum, Lea Brothers. https://archive.org/details/b21271161

Stibitz, George R. (1948). “The Organization of Large-Scale Computing Machinery.” Proceedings of a Symposium on Large-Scale Digital Calculating Machinery, Harvard UP, pp. 91–100.

Tropp, Henry S. "Origin of the Term Bit." (April 1984). Annals of the History of Computing, vol. 6, no. 2, pp. 154-55. https://dl.acm.org/doi/10.5555/357447.357455.

von Neumann, J. (June 30, 1945). First Draft of a Report on the EDVAC. Moore School of Electrical Engineering, University of Pennsylvania. https://history-computer.com/Library/edvac.pdf.

von Neumann, J. (1993). "First Draft of a Report on the EDVAC." IEEE Annals of the History of Computing, Vol. 15, No. 04, pp. 27-75. doi: 10.1109/85.238389.

Williams, Michael R. (1993). "The Origins, Uses, and Fate of the EDVAC." IEEE Annals of the History of Computing, Vol. 15, No. 1, pp. 22-38.

 

Johannah Rodgers (September 2021). “Before the Byte, There Was the Word: The Computer Word and Its Many Histories.” Interfaces: Essays and Reviews on Computing and Culture Vol. 2, Charles Babbage Institute, University of Minnesota, 76-86.


About the author:

Johannah Rodgers is a writer, artist, and educator whose work engages creatively and critically with the histories and presents of print and digital technologies to explore their connections and their roles in the sociologies and economics of literacies in the U.S.  She is the author of Technology: A Reader for Writers  (Oxford University Press, 2014), the Founding Director of the First Year Writing Program at the New York City College of Technology, where she was Associate Professor, and a participant in the 2020-21 University of Cambridge Mellon Sawyer Seminar on the Histories of AI.  You can read more about her projects and publications at www.johannahrodgers.net.

 

 

Early “Frictions” in the Transition towards Cashless Payments

Bernardo Bátiz-Lazo (Northumbria) and Tom R. Buckley (Sheffield)

Abstract: In this article we describe the trials and tribulations of the early efforts to introduce cashless retail payments in the USA. We compare efforts by financial service firms and retailers. We then document the ephemeral life of one of these innovations, colloquially known as “Hinky Dinky.” We conclude with a brief reflection on the lessons these historical developments offer for the future of digital payments.

(PDF version available for download.)

A shopper leaving a Hinky Dinky supermarket, ca. 1970s.

Photo credit: “Supermarkets as S&L Branches,” Banking Vol. 66 (April 1974) pg. 32

Let’s go back to the last quarter of the 20th century. This was a time when the high economic growth that the USA had enjoyed since the end of World War II was coming to an end, replaced by economic crisis and high inflation. It was a time when cash was king, and close to 23% of Americans worked in manufacturing. A time when the suburbs – to which Americans had increasingly flocked after 1945, escaping city centres – were starting to change. Opportunities for greater mobility were offered by automobiles, commercial airlines, buses, and the extant railway infrastructure.

This was the period that witnessed the dawn of the digital era in the United States, as information and communication technologies began to emerge and grow. The potential of digitalisation provided the context in which an evocative idea, the idea of a cashless society, first began to emerge. This idea was associated primarily with the elimination of paper forms of payment (chiefly personal checks) and the adoption of computer technology in banking during the mid-1950s (Bátiz-Lazo et al., 2014). Here it is worth noting that, although there is some disagreement as to the exact figure, the volume of paper checks cleared within the U.S. had at least doubled between 1939 and 1955, and the expectation was that this would continue to rise. This spectacular rise in check volume, with no corresponding increase in the value of deposits, placed a severe strain on the U.S. banking system and led to a number of industry-specific innovations emerging from the 1950s, such as the so-called ERMA and magnetic ink characters (Bátiz-Lazo and Wood, 2002).

The concept of the cashless, checkless society became popularised in the press on both sides of the Atlantic in the late 1960s and early 1970s. Very soon the idea grew to include paper money. At the core of this imagined state was the digitalization of payments at the point of sale, a payment method that involved both competition and co-operation between retailers and banks (Maixé-Altés, 2020 and 2021).

Early Point of Sale terminals, ca. 1970s.

Photo credit: Hagley Museum and Archives, Philadelphia Savings Fund Society Collection 2062, Box 13, PSFS Online News Bulletin, Vol. 80-102, June 23, 1980.

In the banking and financial industry, new, transformative technologies thus began to be trialled and developed in order to make this a reality (Maixé-Altés, 2019). Financial institutions accepting retail deposits had been at the forefront of the adoption of commercial applications of computer technology (Bátiz-Lazo et al., 2011). Early forms of such technical devices mainly focused on improving “back office” operations and encompassed punch card electromechanical tabulators in the 1920s and 1930s, later analogue devices (such as the NCR Post Tronic of 1962), and, in the late 1960s, the widely adopted IBM 360. But at the same time, regulation curtailed diversification of products and geography (limiting the services banks could provide their customers). These regulatory restrictions help to explain ongoing experiments with a number of devices which involved a significant degree of consumer interaction, including credit cards (Stearns, 2011), the use of pneumatic tubes and CCTV in drive-through lanes, home banking, and Automated Teller Machines (ATMs), which despite being first introduced in the late 1960s and early 1970s, would ultimately not gain acceptance until the early 1980s (Bátiz-Lazo, 2018).

Like the banking and financial industry, the retail industry, with its very real interest in point-of-sale digitalization, was exposed to the rise of digital technology in the last quarter of the 20th century. The digitalisation of retailing occurred later than in other industries in the American economy (for a European account see Maixé-Altés and Castro Balguer, 2015). Once it arrived, however, the adoption of a range of digital technologies, including Point of Sale (POS) related innovations such as optical scanning and the universal product code (UPC), was extensive and transformed the industry (Cortada, 2003). From the perspective of historical investigation, the chronological place of such innovation, beginning in the mid-1970s, is associated with a remarkable period of rapid technological change in U.S. retailing (Basker, 2012; Bucklin 1980). Along with rapid technological change, shifts in the structure of retail markets, in particular the decline of single “mom and pop” stores and the ascent of retail chains, also became more pronounced in the 1970s (Jarmin, Klimek and Miranda, 2009). Two decades later, such large retail firms would account for more than 50% of the total investment in all information technology by U.S. retailers (Doms, Jarmin and Klimek, 2004).

What connects the transformative technological changes that occurred in both the banking industry and the retail industry during this period is that both sought to utilise Electronic Funds Transfer Systems, or EFTS, as a way to reduce frictions in retail payments at the point of sale. During the 1970s and 1980s, the term EFTS was used in a number of ways. Somewhat confusingly, it was applied indiscriminately to specific devices or ensembles, value exchange networks, and what today we denominate infrastructures and platforms. When referring to it as a systems technology for payments, it was defined as one:

 “in which the processing and communications necessary to effect economic exchange and the processing and communications necessary for the production and distribution of services incidental to economic exchange are dependent wholly or in large part on the use of electronics” (National Commission on Electronic Funds Transfer, 1977, 1).  

Ultimately EFTS would come to be extended to the point of sale and embodied in terminals which allowed for automatic, digital, seamless transfer of money from the buyer’s current account to the retailer’s, known as the Electronic Funds Transfer at the Point of Sale, or EFTS-POS (Dictionary of Business and Management: 2016). 
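
For readers unfamiliar with the bookkeeping being described, the minimal sketch below shows the core steps an EFTS-POS style transfer performs: verify the buyer's balance, then debit the buyer and credit the retailer in one recorded step. It is a conceptual illustration in Python with invented account names and amounts, not a reconstruction of Transmatic Money Service or any other historical system.

```python
# Conceptual sketch of an EFTS-POS style transfer: verify, then debit buyer / credit retailer.
# Account names and balances are invented; no historical system is being reconstructed.

class InsufficientFunds(Exception):
    pass

class Ledger:
    def __init__(self, opening_balances):
        self.balances = dict(opening_balances)
        self.journal = []  # record of every completed transaction

    def transfer(self, buyer, retailer, amount):
        """Move `amount` from the buyer's account to the retailer's, recording the transaction."""
        if self.balances[buyer] < amount:
            raise InsufficientFunds(f"{buyer} cannot cover {amount}")
        self.balances[buyer] -= amount      # debit the buyer's current account
        self.balances[retailer] += amount   # credit the retailer's account
        self.journal.append((buyer, retailer, amount))

ledger = Ledger({"buyer_account": 100.00, "retailer_account": 0.00})
ledger.transfer("buyer_account", "retailer_account", 37.50)
print(ledger.balances)  # {'buyer_account': 62.5, 'retailer_account': 37.5}
print(ledger.journal)   # [('buyer_account', 'retailer_account', 37.5)]
```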

One of the factors that initially held back the adoption of early EFTS and the equipment that utilised it was the lack of infrastructure that would connect the user, the retailer, and the bank (or wherever the user’s funds were stored). As Bátiz-Lazo et al. (2014) note, the idea of a cashless economy that would provide this infrastructure was highly appealing… but implementing its actual configuration was highly problematic. Indeed, in contrast to developments in Europe, some lawmakers in Congress considered the idea of banks sharing infrastructure a competitive anathema (Sprague, 1977). Large retailers such as Sears had a national presence and were able to consider implementing their own solution to the infrastructure problem. Small banks looked at proposals by the likes of Citibank with scepticism, fearing they would entrench the dominance of large banks. George W. Mitchell (1904-1997), a member of the Board of the Federal Reserve, and management consultant John Diebold (1926-2005) were outspoken promoters of the adoption of cashless solutions, but their lobbying of public and private spheres was not always successful. Perhaps the biggest chasm between banks and retailers, though, resulted from the capital-intensive nature of the potential network and infrastructure that any form of EFTS required.

 

Early use of an ATM at Dahl’s Foods supermarket in Iowa, circa 1975.

Photo credit: Courtesy of Diebold-Wincor Inc.

Amongst the alternative solutions that were trialled by banks and retailers, there were a number of successes, such as ATMs (Bátiz-Lazo, 2018) and credit cards (Ritzer, 2001; Stearns, 2011). Both bankers and retailers were quick to see a potential connection between the machine-readable cards and the rapid spread of new bank-issued credit cards under the new Interbank Association (i.e., the genesis of Mastercard) and the Bankamericard licensing system (i.e., the genesis of Visa), both of which began in 1966, just as the vision of the cashless society was winning acceptance. Surveys from the time indicate that at least 70 percent of bankers believed that credit cards were the first step toward the cashless society and that they were entering that business in order to be prepared for what they saw as an inevitable future (Bátiz-Lazo et al., 2014).

There were also a number of less successful attempts that, far from being relegated to the ignominy of the business archives, offer an important insight into the implementation of a cashless economy which is worth preserving for future generations of managers and scholars. Chief amongst these is a system widely deployed by an alliance of U.S. savings and loans (S&Ls) with mid-sized retailers under the sobriquet “Hinky Dinky.” Interestingly, Maixé-Altés (2012, 213-214) offers an account of a similar, independent, and contemporary experiment in a very different context, Spain. The Hinky Dinky moniker derived from an experiment by the Nebraskan First Federal Savings and Loan Association, which in 1974 placed computer terminals in stores of the Hinky Dinky grocery chain, which at its apex operated some 50 stores across Iowa and Nebraska. The Hinky Dinky chain was seen by First Federal as the perfect retail partner for this experiment owing to the supermarket’s popularity with local customers, an appeal that would be beneficial to this new technology. That popularity was particularly valuable, as the move by First Federal Savings and Loan to establish an offsite transfer system challenged, but did not break, banking law at that time (Ritzer, 1984).

At the heart of the technical EFT system initiated by First Federal, formally known as Transmatic Money Service, was a rudimentary, easy-to-install package featuring a point-of-service machine with limited accessory equipment in the form of a keypad and magnetic character reader. The terminal was housed in a dedicated booth within the store and was operated by store employees (making a further point of the separation between bank and retailer). The terminal enabled the verification and recording of transactions as well as the instant updating of accounts. The deployment of the terminals in Hinky Dinky stores shocked the financial industry because it made the Nebraska S&L appear to be engaging in banking activities, while the terminals themselves provided banking services to customers in a location that was not a licensed bank branch!

The Philadelphia Savings Fund Society's negotiable order of withdrawal (NOW) account machine, "Act One."

Photo credit: Hagley Museum and Archives, Philadelphia Savings Fund Society Collection 2062, Box 13, PSFS Online News Bulletin, Vol. 79-127, September 13, 1979.

From its origins in a mid-sized retail chain in the Midwest, some 160 “Hinky Dinky” networks appeared across the USA between 1974 and 1982, before S&Ls abandoned them in favour of ATMs and credit cards. These deployments included a roll-out in 1980 by the largest savings bank by assets in the USA at the time, the Philadelphia Savings Fund Society, or PSFS. Rather than commit to the large capital investment that ATMs necessitated, without guarantees of their viability or a secure return on investment, the PSFS pivoted to the “Hinky Dinky” terminals as part of the roll-out of its negotiable order of withdrawal (NOW) accounts (commercialised as “Act One”).

NOW accounts were launched in the early 1970s by the Consumer Savings Bank, based in Worcester, MA (today part of USBank), as a way to circumvent the Depression-era restrictions that barred interest payments on current account deposits. Between 1974 and 1980, Congress took incremental steps to allow NOW accounts nationwide, something the PSFS wanted to take advantage of. Consequently, in February 1979, the PSFS signed an agreement with the Great Atlantic and Pacific Tea Company (A&P) to install Transmatic Money Service devices in 12 supermarket locations. This was part of the PSFS’s wider strategy “to provide alternative means for delivering banking services to the public” (Hagley Archives: PSFS Collection).

These terminals did not, however, allow for the direct transfer of funds from the customer’s account to the retailer’s. Rather, the terminals, which were operated by A&P employees, were activated by a PSFS plastic card that the society issued to customers, and they enabled PSFS customers with a Payment and Savings account to make withdrawals and deposits. The terminals also allowed PSFS cardholders and A&P customers to cash cheques.

The equipment used by the PSFS, the Hinky Dinky devices, therefore represented an interesting middle ground which improved transaction convenience for consumers, was low risk for the retailer, and was relatively less costly for banks and financial institutions than ATMs (Benaroch & Kauffman, 2000).

One of the most interesting features of the Hinky Dinky terminals as they were deployed by the PSFS and First Federal Savings was that they represented co-operative initiatives between retail organisations and financial institutions. As mentioned before, this was not necessarily the norm at the time. As the legal counsel to the National Retail Merchants Association (a voluntary non-profit trade association representing department, speciality and variety chain stores) wrote in 1972: “Major retailers… have not been EFTS laggards. However, their efforts have not necessarily or even particularly been channelled toward co-operative ventures with banks” (Schuman, 1976, 828). These sentiments were echoed by more neutral commentators who similarly highlighted the lack of dialogue between retailers and financial institutions on the topic of EFTS (Sprague, 1974). The extent to which retailers provided financial services to their customers had long been a competitive issue in the retail industry: the ability of chain stores, such as A&P in groceries and F.W. Woolworth in general merchandise, to offer low prices and better value owed much to their elimination of credit and deliveries (Lebhar, 1952). With the advent of EFT, retail organisations’ provision of financial services raised the prospect of this becoming a competitive issue between these two industries.

A JC Penney customer applying for a credit card ca. 1950s.

Photo credit: From the J.C. Penney 1959 Annual Report.

The prospect of a clash between retailers and banks was increased, moreover, because there had always been other voices, other retailers, who had been willing to offer credit (Calder, 1999). In the early years of the 20th century, consumer demand for retailers to provide credit grew. This caused tension with the cash-only policies of department stores such as A.T. Stewart and Macy’s, and of the mail order firms Sears Roebuck and Montgomery Ward (Howard, 2015). Nevertheless, it was hard to ignore such demand, as evidenced by Sears’s decision to begin selling goods on instalment around 1911 (Emmet and Jeuck, 1950, 265). Twenty years later, in 1931, the company went a stage further by offering insurance products to consumers through the Allstate Insurance Company. Other large retail institutions, however, resisted the pressure to offer credit until much later (J.C. Penney, for instance, would not introduce credit until 1958). Credit activities by large retailers were, nonetheless, a determinant in banks exploring their own credit cards as early as the 1940s, and contributed to the successes of Bankamericard and the Interbank Association in the 1960s (Bátiz-Lazo and del Angel, 2018; Wolters, 2000).

The barriers between banks and financial institutions on the one hand, and retailers on the other, remained fairly robust. Signs that this was beginning to change emerged in the 1980s, when retailers such as Sears began to offer more complex financial products (Christiansen, 1987; Ghemawat, 1984; Raff and Temin, 1997). Yet the more concerted activity by retailers to diversify into financial services would ultimately be stimulated by food retailers (Martinelli and Sparks, 1999; Colgate and Alexander, 2002). The Hinky Dinky system, however, shows that a co-operative, not just a competitive, solution was a very real possibility.

In 2021 we are witnessing an extreme extension and intensification of these trends. Throughout the ongoing Covid-19 pandemic, the use of cash has greatly declined as more and more people switch to digital payments. In the retail industry, even before the pandemic, POS innovations were becoming increasingly digital (Reinartz and Imschloß, 2017) as retailers shifted toward a concierge model of helping customers rather than simply focusing on processing transactions and delivering products (Brynjolfsson et al., 2013). Consequently, the retail-customer interface was already starting to shift away from one that prioritised the minimisation of transaction and information costs toward an interface which prioritised customer engagement and experience (Reinartz et al., 2019).

A second feature of the pandemic has been the massive increase in interest in crypto currencies, in their many different forms, around the world. This is most apparent in the volatility and price fluctuations of Bitcoin but is also evident in the increased prominence of alternative crypto currencies (such as Ether). Indeed, even central banks in Europe and North America are discussing digital currencies, the government of El Salvador has made Bitcoin legal tender, and the People’s Bank of China has launched its own digital currency in China. Further manifestations of the momentum crypto currencies are gaining include the private initiatives of big tech (such as Facebook’s Diem, formerly Libra). Yet, in spite of all of this latent promise, transactions at the point of sale with crypto currencies are still minuscule, and, time and again, surveys by central banks on payment preferences consistently report that people want paper money to continue to play its historic role.

It thus remains too early to forecast with any degree of certainty the actual long-run effects of the virus, social distancing, and lockdowns on the use of cash, on how consumers acquire products and services, and on what those products and services are. It is also uncertain whether greater use of cryptocurrencies will lead to a decentralised management of monetary policy (and, if so, at what rate). It is, though, almost certain that consumers’ behaviours, expectations, and habits will have been altered by their personal experiences of Covid. In this context, the story of “Hinky Dinky” reminds us to be sober at a time of environmental turbulence, to be wary of extrapolating trends, and to look more closely at the motivations driving the adoption of new payment technologies: some of these trends, like “Hinky Dinky” itself, may appear to enjoy wide acceptance yet turn out to be short-term phenomena.

Acknowledgements

We appreciate helpful comments from Jeffrey Yost, Amanda Wick and J. Carles Maixé-Altés. As usual, all remaining shortcomings are the responsibility of the authors.


Bibliography

Basker, E. (2012). Raising the Barcode Scanner: Technology and Productivity in the Retail Sector. American Economic Journal: Applied Economics, 4(3), 1-27.

Bátiz-Lazo, B. (2018). Cash and Dash: How ATMs and Computers Changed Banking. Oxford: Oxford University Press.

Bátiz-Lazo, B., Maixé-Altés, J. C., & Thomes, P. (2011). Technological Innovation in Retail Finance: International Historical Perspectives. London and New York: Routledge.

Bátiz-Lazo, B., Haigh, T., & Stearns, D. L. (2014). How the Future Shaped the Past: The Case of the Cashless Society. Enterprise & Society, 15(1), 103-131.

Bátiz-Lazo, B., & del Ángel, G. (2018). The Ascent of Plastic Money: International Adoption of the Bank Credit Card, 1950-1975. Business History Review, 92(3 (Autumn)), 509 - 533.

Bátiz-Lazo, B., & Wood, D. (2002). An Historical Appraisal of Information Technology in Commercial Banking. Electronic Markets - The International Journal of Electronic Commerce & Business Media, 12(3), 192-205.

Benaroch, M., & Kauffman, R. J. (2000). Justifying Electronic Banking Network Expansion Using Real Options Analysis. MIS Quarterly, 197-225.

Bucklin, L. P. (1980). Technological Change and Store Operations: The Supermarket Case. Journal of Retailing, 56(1), 3-15.

Calder, L. (1999). Financing the American Dream: A Cultural History of Consumer Credit. Princeton: Princeton University Press.

Colgate, M., & Alexander, N. (2002). Benefits and Barriers of Product Augmentation: Retailers and Financial Services. Journal of Marketing Management, 18(1-2), 105-123.

Cortada, J. (2004). The Digital Hand, Vol 1: How Computers Changed the Work of American Manufacturing, Transportation, and Retail Industries. Oxford: Oxford University Press.

Christiansen, E. T. (1987). Sears, Roebuck & Company and the Retail Financial Services Industry (Part Two). Case 9-387-182. Cambridge, MA: Harvard Business School.

Doms, M. E., Jarmin, R. S., & Klimek, S. D. (2004). Information Technology Investment and Firm Performance in US Retail Trade. Economics of Innovation and New Technology, 13(7), 595-613.

Emmet, B., Jeuck, J. E., & Rosenwald, E. G. (1950). Catalogues and Counters: A History of Sears, Roebuck and Company. Chicago: University of Chicago Press.

Ghemawat, P. (1984). Retail Financial Services Industry, 1984. Case 9-384-246. Cambridge, MA: Harvard Business School.

Hagley Museum and Archives Philadelphia Savings Fund Society Collection 2062 Box 13 POS Program Introduction March 2, 1978.

Howard, V. (2015). From Main Street to Mall: The Rise and Fall of the American Department Store. Philadelphia: University of Pennsylvania Press.

Jarmin, R. S., Klimek, S. D., & Miranda, J. (2009). The Role of Retail Chains: National, Regional and Industry Results. In Producer Dynamics: New Evidence from Micro Data (pp. 237-262). Chicago: University of Chicago Press.

Lebhar, G. M. (1952). Chain stores in America, 1859-1950. New York: Chain Store Publishing Corporation.

Martinelli, E., & Sparks, L. (2003). Food Retailers and Financial Services in the UK: A Coopetitive Perspective. British Food Journal, 105(9), 577-590.

Maixé-Altés, J. C. (2012). Innovación y compromiso social. 60 años de informatización y crecimiento. Barcelona: "la Caixa" Group.

Maixé-Altés, J. C. (2019). The Digitalisation of Banking: A New Perspective from the European Savings Banks Industry before the Internet. Enterprise & Society, 20(1), 159-198.

Maixé-Altés, J. C. (2020). Retail Trade and Payment Innovations in the Digital Era: A Cross-Industry and Multi-Country Approach. Business History, 62(9), 588-612.

Maixé-Altés, J. C. (2021). Reliability and Security at the Dawn of Electronic Bank Transfers in the 1970s-1980s. Revista de Historia Industrial, 81, 149-185.

Maixé-Altés, J. C., & Castro Balguer, R. (2015). Structural Change in Peripheral European Markets: Spanish Grocery Retailing, 1950-2007. Journal of Macromarketing, 35(4), 448-465.

Raff, D., & Temin, P. (1999). Sears, Roebuck in the Twentieth Century: Competition, Complementarities, and the Problem of Wasting Assets. In Learning by doing in markets, firms, and countries (pp. 219-252). Chicago: University of Chicago Press.

Reinartz, W., & Imschloß, M. (2017). From Point of Sale to Point of Need: How Digital Technology is Transforming Retailing. NIM Marketing Intelligence Review, 9(1), 42.

Reinartz, W., Wiegand, N., & Imschloß, M. (2019). The Impact of Digital Transformation on the Retailing Value Chain. International Journal of Research in Marketing, 36(3), 350-366.

Ritzer, G. (2001). Explorations in the Sociology of Consumption: Fast Food, Credit Cards and Casinos. Thousand Oaks, CA: Sage Publishing.

Ritzer, J. (1984). Hinky Dinky Helped Spearhead POS, Remote Banking Movement. Bank Systems and Equipment, December, 51-54.

Schuman, C. R. (1975). The Retail Merchants' Perspective Towards EFTS. Catholic University Law Review, 25(4), 823-842.

Sprague, R. E. (1977). Electronic Funds Transfer In Europe: Their Relevance for the United States. Savings Banks International, 3, 29-35.

Sprague, R. E. (1974). Electronic Funds Transfer System. The Status in Mid-1974 – Part 2. Computers and People, 23(4).

Stearns, D. L. (2011). Electronic Value Exchange: Origins of the Visa Electronic Payment System. London: Springer-Verlag.

United States. National Commission on Electronic Funds Transfer. (1977). EFT and the public interest: a report of the National Commission on Electronic Fund Transfers. Second printing. Washington: National Commission on Electronic Fund Transfers.

Wolters, T. (2000). ‘Carry Your Credit in Your Pocket’: The Early History of the Credit Card at Bank of America and Chase Manhattan. Enterprise & Society, 1(2), 315-354.

 

Bátiz-Lazo, Bernardo and Buckley, Tom R. (August 2021). “Early “Frictions” in the Transition towards Cashless Payments.” Interfaces: Essays and Reviews in Computing and Culture Vol. 2, Charles Babbage Institute, University of Minnesota, 64-75.


About the authors: 

Dr. Bernardo Bátiz-Lazo is Professor of FinTech History and Global Trade at Northumbria University in Newcastle upon Tyne, UK and research professor at Universidad Anahuac (Mexico). He is a fellow of the Royal Historical Society and the Academy of Social Sciences.

Dr. Tom Buckley is currently Lecturer in International Business Strategy at the University of Sheffield. Dr. Buckley received his PhD from the University of Reading’s Henley Business School in 2017. 


 

Top 10 Signs We Are Talking About IBM’s Corporate Culture

James W. Cortada, Senior Research Fellow, Charles Babbage Institute 

Abstract: Using the format of a late-night TV humorist’s “Top 10 List,” this article defines ten characteristics of IBM’s corporate culture deemed core to the way IBM functioned in the twentieth century. The perspective is that of the individual employee’s behavior, conjuring up humorous images of what employees thought of each other’s behavior. Corporate images, however, are serious matters, so as historians focus more attention on corporate cultures, such images will need to be understood. The list also demonstrates that the image of IBM personnel as serious did not always reflect reality within the company.

(PDF version available for download.)

IBM education center in Endicott, NY.
The entrance staircase into an IBM education center in Endicott, NY  used from the 1930s through the 1980s. All employees were routinely required to undergo about two weeks of training each year. 

Between 1982 and 2015, comedian and American television host David M. Letterman hosted highly popular late-night television comedy shows, first Late Night with David Letterman and later the Late Show with David Letterman. One of his recurring segments was the “Top 10 List,” which ranked the features of an issue from tenth to first. These lists became wildly popular, were satirized, and were taken as comedic commentary on contemporary circumstances. He and his staff produced nearly 700 such lists, while Letterman hosted over 6,000 programs viewed by millions of people. He published four book-length collections of these lists. The genius of these lists, as in any good comedy, lay in providing a slightly fractured view of reality. On occasion, he commented on corporations, such as GE, AT&T, Exxon, McDonald’s, QVC, and Westinghouse, among others.

In recent years scholars have become increasingly interested in the history of corporate cultures. For decades historians said corporate cultures were important, but they did little work on the subject. That is beginning to change. I am exploring IBM’s corporate culture, which historians and employee memoirists alike claim was central to the company’s ability to thrive and survive longer than any other firm in the IT industry. To do that, historians have to find new types of documentation to inform their study of the subject. That is why we turn to comedian David Letterman. His lists reflected public interest in a topic, and they captured perspectives people would either agree with or that made sense to them given what they knew of the subject. What better way is there to see what IBM’s image looked like than “10 Signs You Might Be Working at IBM”:

 

10. You lecture the neighborhood kids selling lemonade on ways to improve their process.

Thomas J. Watson Sr., head of IBM from 1914 to 1956, made exploring with customers how best to use his tabulators, and then computers, a central feature of his selling method. Tens of thousands of sales personnel did this. In the process they and other employees improved IBM’s operations and came to be seen as experts on efficient, productive business operations. As the company grew in size, so too did its reputation for having “its act together” on all manner of business and technical issues. That reputation expanded, crowned with the acquisition of 30,000 management and IT consultants from PwC in 2002. There was no industry or large company immune from IBM’s views on how to improve operations, nor any community organization involving IBMers. (Side note: Your author, a retired IBMer, taught his daughters how to improve their Girl Scout cookie sales process.)

IBMers in suits
IBM salesmen dressed for the part.

9. You get all excited when it’s Saturday so you can wear shorts to work.

The two things most people know about IBM in any decade are its THINK sign and that its employees always wore dark suits, white shirts, regimental ties, and black socks and shoes. It turns out they did that not because someone at the top of the corporation mandated such a dress code, but because that is how customers tended to dress. There was no official dress code, although most IBMers can tell a story of some lower-level manager sending someone home to change out of their blue shirt or, shockingly, out of their brown loafers. But Thomas J. Watson, Jr., CEO between 1956 and 1971, finally opined on the subject in 1971, observing that IBM’s customers dressed in a “conservative” manner. He thought, “it is safe to say that the midstream of executive appearance is generally far behind the leading edge of fashion change.” So, “a salesman who dresses in a similar conservative style will offer little distraction from the main points of business discussion and will be more in tune with the thinking of the executive.” That is why, “we have always had a custom of conservative appearance in IBM.” People are thus asked, “to come to work appropriately dressed for their job environment.” That’s it: the smoking gun, the root source of the true IBM dress code policy! Millions of people have met IBM employees wearing blue suits. Sociologists point out that every profession has its uniform (e.g., university students their blue jeans, cooks their tall white hats, IBMers their wingtips and white shirts). The Letterman quip implies that IBM employees were willing to put in long hours, including Saturdays, on behalf of their company.

President Thomas J. Watson, Sr, here in the 1950s.
IBM President Thomas J. Watson, Sr, ca. 1950s.

8. You refer to the tomatoes grown in your garden as deliverables.

Deliverables is a word with a long history at IBM, dating back at least to the 1950s. While its exact origins are yet to be uncovered, technical writers worked with product developers on a suite of publications that needed to accompany all product announcements or modifications to them. In fact, one could not complete a product development plan without including among its deliverables a written communication plan. The plan had to specify when and how press releases, General Information manuals, and user and maintenance manuals would be published, and how these would accompany products to customers’ data centers. When IBM entered the services business in the 1990s, focusing largely on management and strategy consulting with the creation of the IBM Consulting Group, use of the word expanded across IBM. Management consultants in such firms as Ernst & Young, Booz Allen, and McKinsey, among others, always concluded their consulting projects with a final report on findings and recommendations, which they, too, called deliverables. As these people came to work at IBM, they brought their language with them. By the end of the 1990s, it seemed everyone was using deliverables to describe their work products. So the Letterman List got right another IBM cultural trait, one so pervasive that employees did not even realize they said deliverables.

 

7. You find you really need Freelance to explain what you do for a living.

Oh, this one hurts if you are an IBMer. The “ouch” is caused by the fact that during the second half of the twentieth century, employees attended meetings armed with carefully prepared slide presentations. They began with Kodak film slides, then 8 x 10 film called foils (IBMers may have had an exclusive on that word), followed by a precursor of PowerPoint called Freelance. Every manager, it seemed, carried a short presentation about their organization, what it did, and so forth. By the end of the 1960s it seemed no proposal to a customer or higher-level manager was missing the obligatory presentation. When Louis V. Gerstner came to IBM as its new CEO in 1993, he immediately noticed this behavior and essentially outlawed the practice among his executives when meeting with him. He wanted them to look into his eyes and talk about “their business.” Eventually he retired, and by the early 2000s PowerPoint was back. By the 2010s the nearly two hundred firms acquired by IBM came fully stocked with employees who also clutched their slide presentations. Edward Tufte, the Yale professor most noted for his multi-decade criticism of PowerPoint presentations, must clearly have had IBM in mind, although he admitted many corporations suffered from similar behavior. He went on to study the role of graphics and the presentation of statistics and other data.

 

6. You normally eat out of vending machines and at the most expensive restaurant in town within the same week.

This observation required true insight. One of the TV writers must have interviewed a salesman, consultant, manager, or executive to stumble across this one. IBM people spent a considerable amount of time traveling to visit customers, attend internal and industry meetings, fulfill their normal requirement of two weeks of training every year, attend conferences to make presentations, or meet with government officials. By the 1980s it was not uncommon for large swaths of IBM to be organized in some matrix manner in which one’s immediate supervisor was in another country while yet another manager with whom one had to work was perched elsewhere. To kiss the ring, one had to travel to wherever that manager held court. Some professions, such as sales and consulting and middle and senior management, turned themselves into tens of thousands of “road warriors.” So, one might fly to a city and take a customer out to dinner at a magnificent restaurant to build personal rapport and to conduct business, in slang terms sometimes referred to as “tavern marketing,” but then rush afterwards to the airport to catch the “red eye” overnight flight home or to some other destination to attend yet another meeting. That could mean eating vending machine food after an airport’s restaurants had closed, at a work location that had no restaurant, or when there was no time to rush out for something. IBMers also prided themselves on making their flights “just in time,” meaning no time for a leisurely meal. You were complimented if you reached the airplane’s door just as it was about to be closed.

 

5. You think that “progressing an action plan” and “calenderizing a project” are acceptable English phrases.

Since at least the 1970s, employees putting together those famous slide presentations were retreating from writing full sentences and engaging in the very bad habit of turning nouns into verbs. Technical writers in the firm eschewed such behavior, as did the media relations community. Employees working in headquarters jobs in the United States were particularly notorious offenders. Letterman may not have known of the most widely used example, “to solution” something, or its variant, “I will solution that problem.” The usage was intended to project force, action, determination, and leadership. Nobody seemed embarrassed by their ignorance of the English language. If one worked in the same building as hundreds or thousands of people without visiting too many other workplaces, local speech patterns became evident. The New York area’s IBM community was notorious; they wanted people to come to them, and when that happened visitors were subjected to such language. As cultural anthropologists since Claude Lévi-Strauss have pointed out, going back as early as the 1930s, tribes form their own language tied to their cultures and lifestyles. IBMers were guilty of the same. It is part of the behavior that led to such usages as “foils.”

 

4. You know the people at faraway hotels better than your next-door neighbors.

This has to qualify as true for some road warriors. It ties to No. 6 about vending machines. Consultants, in particular, would leave home on a Sunday night or Monday morning and not return until Friday night if on long-term projects. They were commuting, and so when home they took care of domestic chores or spent time with their families. It was—is—not uncommon for employees to know the names of flight attendants and hotel staff, since those individuals, too, had set work schedules. Knowing the names of restaurant staff working near a client’s offices was—is—not uncommon, either. Such knowledge could be exotic, such as knowing the flight attendant assigned to one’s Monday morning flight to Orange County, California, and at the same time the doorman at one’s favorite Lebanese restaurant in Paris. This is not conceit, just the reality that IBM employees did a considerable amount of traveling in the course of their careers. It was both an attraction and a burden. Travel made work interesting but also taxed one’s circadian rhythms, not helped by rich food or vending machine delights.

IBM awards dinner held in Endicott New York in the late 1940s
The company made sure families were also involved in IBM-sponsored events to strengthen the bonds and its corporate culture. This is an awards dinner held in Endicott, New York in the late 1940s.

3. You ask your friends to “think outside the box” when making Friday night plans.

In IBM’s culture, solving problems is a practice shared by all employees almost all the time. It became a worldview, a framework, and an attitude toward activities in both their professional and private lives. It has been fostered within the firm since the 1910s, largely because the products it sold required addressing customer issues and other challenges to IBM’s internal operations. Over the years, language and phrases emerged that were embraced to speak to that issue. Thinking outside the box spoke to the frequent need to come up with a solution that had not been tried before. That behavior prized imagination and, equally, a reluctance to accept no as an answer to a request. For a century, for example, salesmen and engineers were taught when encountering an objection or a problem not to take it personally, but to decompose it to understand what it really is, and then come up with a “fix” for it. There is an age-old sales adage that helps here: “The selling doesn’t start until the customer says no.” Flipping a “no” into a “yes” requires “thinking outside the box.” The same mindset was applied in one’s private life too.

 

2. You think Einstein would have been more effective had he put his ideas into a matrix.

Someone must have had spies in IBM or was a business school professor of organizational theory, because by the 1960s much of IBM was organized like a matrix. As one student of IBM’s culture with experience studying corporate structures explained: “I’ve never seen this in any other company,” adding, “with all those dotted lines and multiple bosses.” However, it worked because everyone subscribed to a common set of values and behaviors, and all had documented performance plans that stated explicitly what they were to do. Where one sat in IBM ensured that everyone’s slide presentation included an organization chart to which the speaker could point to explain where they perched. Another observer opined that, “It is probably the most complex organization that I have seen”; hence the allusion to the brilliant Albert Einstein. Hundreds of thousands of employees lived in such matrices and somehow it all worked, because IBM made money and profits, with a few exceptions that Letterman and his audience might not even have been aware of, since most stockholders were the rich and institutional investors.

Following Letterman’s practice: “And the Number One Sign That You Work at IBM” with a drum roll, of course:

 

1. … You think a ‘half-day’ means leaving at 5 o’clock.

Employees had a work ethic that customers saw displayed in many ways: travel schedules, customer engineers working around the clock and over weekends to install and repair hardware and software, consultants who showed up at 8 in the morning and left at 7:30 in the evening to dine at one of those fine restaurants or to wolf down pizzas as they prepared for a client presentation to be made first thing in the morning. It was a life of endless dinners with clients and one’s management, or student teams working on case studies until midnight in some training program. Weekend planning retreats were all too common, especially in the fall as IBMers prepared for the next year, or for spring reviews, a ritual requiring weeks of preparation for when executive management would swoop in to inspect, often knowing as much about one’s business as the presenters. The company did nothing to hide the long hours its employees put in; it exemplified the wisdom of the Grand Bargain, which held that in exchange for loyalty and hard work one was assured a lifetime of employment at IBM. The 5 o’clock comment recognized that employees were seen as far more loyal to the firm, and as defenders of its ways, than was evident at other companies. One sees such comments in bits and pieces in memoirs and accounts of the IT industry, but the Letterman list cleverly summed it up.

25 years of service image
For the most part, for over eight decades, employees found their employment with IBM a source of pride. Those with 25 years of “service” were considered an elite group, at least until the 1990s.

So, what image did the Letterman List portray of IBMers to millions of people? While many had a good laugh, it affirmed that IBM’s employees were serious, knowledgeable, seemingly always on duty (even at home and in their neighborhood), focused on results, imaginative, and possessed of their own ways of doing and talking. IBM had purposefully worked on developing that image since the 1910s and a century later still retained it. It was part of a larger, hardly discussed, corporate strategy of creating an information/business ecosystem in the world of IT, which it dominated.

But, of course, what Letterman may have missed are the many other lists, such as the hundreds of line items defining IBM (e.g., I’ve Been Moved). IBMers did not sleep wearing their black wingtip shoes, nor cut their lawns wearing white shirts. They actually had a sense of humor, as historians are beginning to discover. IBMers conjured up comedic skits across the entire company around the world. They did standup joke telling and, of course, sang songs, often with lyrics tailored to some Letterman-like observation about IBM.

But here is the punch line. Letterman never drew up this list; it is a spoof, prepared by an IBMer, that circulated around the Internet. It was probably written in 1997, while IBM’s old culture was still very much in place, so what is said here reflects IBM employees’ own insights into the company’s culture. In short, there is more accuracy in this list than the comedian could have conjured up. But it was done so well that you have to admit you believed it.

On a more serious note, corporate image is an important issue. Today, for example, Facebook is being criticized for being irresponsible in supporting the flow of accurate information through society. It must have some employees who cringe that the driveway into their corporate headquarters is named Hacker Way, which suggests a company with teenage-like behavior now that it has become an important component of modern society. IBM studiously avoided such traps. Amazon, which enjoyed a positive image for years, was recently criticized for working conditions that led to an attempted unionization effort at an Alabama facility, and for its aggressive actions to crush the initiative. President Joe Biden even publicly supported the unionizing effort. IBM never unionized in the United States; it never had a counter-culture name for a road, and it never spoke about breaking things, but rather about building them. From the beginning it wanted to be seen as a firm bigger than it was and as a serious, responsible pillar of society. Today’s business titans have much to learn from IBM’s experience.

How would Letterman treat Apple, Microsoft, Facebook, Cisco, Amazon, Verizon, or Disney? He poked fun at other companies and, at least within IBM when he was popular on television, employees came up with their own Top 10 Lists all the time. If these other companies would be embarrassed by the humor, it suggests that Letterman has some business management lessons to teach them too.


Bibliography

Cortada, James W. (2019).  IBM: The Rise and Fall and Reinvention of a Global Icon.  Cambridge, Mass.: MIT Press.

Pugh, Emerson W. (1995).  Building IBM: Shaping an Industry and Its Technology.  Cambridge, Mass.: MIT Press.

Watson, Thomas J., Jr. (1963, 2000). A Business and Its Beliefs: The Ideas that Helped Build IBM.  New York: McGraw-Hill.

___________ and Peter Petre. (1990).  Father, Son, and Co.: My Life at IBM and Beyond. New York: Bantam.

 

Cortada, James W. (July 2021). “Top 10 Signs We Are Talking About IBM’s Corporate Culture.” Interfaces: Essays and Reviews in Computing and Culture Vol. 2, Charles Babbage Institute, University of Minnesota, 55-63.


About the author: James W. Cortada is a Senior Research Fellow at the Charles Babbage Institute, University of Minnesota—Twin Cities. He conducts research on the history of information and computing in business. He is the author of IBM: The Rise and Fall and Reinvention of a Global Icon (MIT Press, 2019). He is currently conducting research on the role of information ecosystems and infrastructures.


 

NFTs, Digital Scarcity, and the Computational Aura

Annette Vee, University of Pittsburgh

Abstract: Here, I draw on Walter Benjamin’s discussion of the aura of original art in “The Work of Art in the Age of Mechanical Reproduction” to explore the appeal of NFTs (non-fungible tokens) in the age of digital reproduction. I explain what NFTs on the blockchain are and point to other attempts at scarcity in digital contexts, including Cryptokitties and the Wu-Tang Clan’s Once Upon a Time in Shaolin. Just as Bitcoin emerged from the 2008 financial crisis, NFTs have gained traction in the Covid-19 pandemic, demonstrating that scarce, rivalrous, positional goods are desirable even when computational networks afford perfect replication at scale.

(PDF version available for download.)

*Please note: Explicit language quoted in this article may be offensive to some readers.

Beeple's Everydays: The First 5000 Days
Figure 1: Screenshot of Christie's page showing the final price and image of Beeple's Everydays: The First 5000 Days, https://onlineonly.christies.com/s/beeple-first-5000-days/beeple-b-1981-1/112924

 

“holy fuck.” Beeple tweeted at 10:42 a.m. on March 11, 2021, when his artwork “Everydays: The First 5000 Days,” a jpg file measuring 21,069 pixels square, sold for $69,346,250 at auction on Christie’s online. Holy fuck, indeed: the first all-digital artwork sold at Christie’s—a composite of edgy, meme-worthy images the artist had posted every day since 2007—fetched a price in the same league as works by van Gogh, Picasso, Rothko, and Warhol. In 1987, another Christie’s auction made headlines: Vincent van Gogh’s Still Life: Vase with Fifteen Sunflowers (“Sunflowers”) sold for nearly $40 million, tripling the record from any previous sale of art. Putting aside comparative judgements of quality, the nearly $70 million for Everydays was a lot of money to pay for a piece of art that could be perfectly replicated and stored on any given laptop. What made Everydays more like Sunflowers than millions of other jpgs?

Everydays was minted on 16 February 2021 and assigned a non-fungible token (NFT) on the Ethereum blockchain. This NFT authenticates Everydays and distinguishes it from any other bit-for-bit copy of the same file. Where a van Gogh listing on Christie’s site might declare medium, date, and location (e.g., oil on canvas, 1888, Arles), Everydays lists pixel dimensions, a token ID, a wallet address, and a smart contract address. In a Special Note on the Everydays auction, Christie’s declared it would accept the cryptocurrency Ether, but only in digital wallets hosted by a select group of platforms. Implicit in the listing of these addresses, the token ID, and trusted platforms is an attempt at digital authenticity.

Digital reproduction enables exact copies of art. Even when artists employ watermarks to encourage payment for digital art, the same tools that make the images and mark them can be used to restore and replicate them. NFTs secure digital art not by changing the file itself, but by changing its provenance. NFTs attach a unique identifier, or token, to represent the art on the blockchain. Their non-fungibility differentiates them from cryptocurrencies that also rely on the blockchain. Bitcoin or Ether, for example, are fungible: any Bitcoin spends the same as any other Bitcoin. And like the fiat currency of the dollar, Bitcoin can be spent anywhere that particular cryptocurrency is accepted. In contrast, a non-fungible token is intentionally unique and cannot be spent. But because any given NFT or cryptocoin has a unique position on the blockchain, it cannot be counterfeited. Blockchain security relies on long chains of transactions, each dependent on the previous transaction, with the entire series of transactions made public. Altering one transaction would require altering the copy of the record decentrally stored across thousands of machines simultaneously. In other words, it’s effectively impossible.
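
To make the tamper-evidence claim concrete, here is a minimal sketch in Python of a hash-linked ledger in the spirit of a blockchain. It is an illustration only, not the Bitcoin or Ethereum data structure: each record commits to the hash of the previous record, so editing any earlier transaction changes every hash that follows and is immediately detectable.

```python
import hashlib
import json

def record_hash(prev: str, tx: str) -> str:
    """Hash a record deterministically (stable JSON -> SHA-256)."""
    return hashlib.sha256(json.dumps({"prev": prev, "tx": tx}, sort_keys=True).encode()).hexdigest()

def append(chain: list, tx: str) -> None:
    """Append a record that commits to the hash of the previous record."""
    prev = chain[-1]["hash"] if chain else "0" * 64  # the first record points at zeros
    chain.append({"prev": prev, "tx": tx, "hash": record_hash(prev, tx)})

def verify(chain: list) -> bool:
    """Re-derive every hash and every back-link; any edit breaks the chain."""
    prev = "0" * 64
    for record in chain:
        if record["prev"] != prev or record["hash"] != record_hash(record["prev"], record["tx"]):
            return False
        prev = record["hash"]
    return True

ledger: list = []
append(ledger, "artist mints token 1")
append(ledger, "artist transfers token 1 to auction winner")
print(verify(ledger))                          # True
ledger[0]["tx"] = "someone else mints token 1"
print(verify(ledger))                          # False: the tampering is detectable
```

On a real blockchain the analogous re-checking happens on thousands of machines at once, which is why rewriting history would mean rewriting it nearly everywhere simultaneously.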

When a digital piece of art has an NFT associated with it, it’s been marked as authentic by the artist or an agent with the power to authenticate it. While the digital art itself might be reproducible, the version that’s on auction ostensibly has the imprimatur of the creator. It’s been digitally touched by the artist. You, too, can own an exact copy of Everydays by downloading it here. But you can’t be part of the transaction that was recorded on the Ethereum blockchain, which involved Beeple transferring the token to the winner of the auction. (For more details on the exact chain of transactions from digital file to blockchain to auction and purchase, Robert Graham offers a more technical breakdown.)

Renaissance artists such as Leonardo da Vinci painted in a studio with the help of assistants. What makes a da Vinci a da Vinci isn’t that he painted all of it, but that he painted at least some of it, and that a community of experts take some responsibility for the claim that he did (Langmead et al.). We can compare a da Vinci painting to Jack Dorsey’s first tweet, which has been reproduced everywhere but is now associated with an NFT (Boucher). It’s like having Jack Dorsey touch the tweet before selling it, adding his splash of paint. The scarcity is what makes it valuable; the NFT buyer owns something that others do not and cannot. One of the reasons that Salvator Mundi sold for $450 million—shattering all previous records for art auctions—is that it is one of only 20 paintings attributed to da Vinci (Langmead et al.). Dead artists generally fetch more for their work than living artists because they aren’t making any more art (Jeff Koons holds the living-artist record for his metal sculpture Rabbit, sold for $91.1 million in 2019). For NFTs as well as physical art, scarcity depends on human trust in the creator as well as the system that verifies its connection with the creator.

Still Life: Vase with Fifteen Sunflowers
Figure 2: Still Life: Vase with Fifteen Sunflowers, by artist Steven Vee.

I own a version of Still Life: Vase with Fifteen Sunflowers, which hangs on a wall in my family room. This Sunflowers is an original piece of art, has a traceable provenance, and is beautiful. It has an aura just like the one that auctioned for nearly $40 million in 1987. But the reason my copy wouldn’t fetch the same price at auction (though I admit I haven’t tried) is that the artist is Steven Vee, my dad. His paintings are highly valued in the diaspora of my hometown but are unknown to the van Gogh connoisseurs who bid at Christie’s. There’s the matter of the work’s age (15 years vs. 100 years) and materials (acrylic vs. oil paints). But the main difference between the two pieces of art is their aura: who imbued them with the aura, how they painted them, where, and who has owned them. My Sunflowers is valuable to me, but probably not to Christie's. (Although if it were, I would let it go for a mere $20 million—sorry, Dad.)

Digital scarcity

Scarcity is a default attribute for a physical piece of art: both Vincent van Gogh's Sunflowers and Steve Vee's Sunflowers have multiple versions, but each individual painting is unique. Artificial scarcity has been the primary solution to the aura problem for mechanical reproduction of art. Limited print runs can ensure that a collector has one of only 20 prints, even if it’s technically possible to produce hundreds of them. Although the print may not be directly touched by the creator, its scarcity gives it value.

But scarcity is tricky with digital work. The fact that digital files are perfectly and infinitely reproducible makes it difficult to limit copies, at least once a digital file is released to another party. Perfect replication is one of the advantages of digitality, but it works against exclusive ownership. NFTs are a solution to this problem, but there have been others, each specific to its digital and social context.

In virtual spaces, scarcity emulates physical spaces. In the virtual world of Second Life, which was popular in the early 2000s and had a GDP to rival Samoa’s in 2009, users can build and buy property (Fleming). While the number of islands on which to build is theoretically infinite, each particular island and construction is ensured to be unique because the world is hosted by one company, Linden Lab. In Second Life, particular goods and constructions could be copied, and they were the subject of intellectual property debates and court cases. And property ownership is subject to Linden Lab’s continued management and discretion.

Second Life
Figure 3: Launched in 2003, Second Life is still around. Screenshot from https://secondlife.com/

In high-stakes online poker in the early 2000s, big-time players sold expensive coaching manuals as pdf files and protected their scarcity by introducing a small variation in each version—an extra comma on page 34, for instance (Laquintano). It is very easy to circulate a pdf: pdfs have a small file size and are easily stored and read by default programs on consumer machines. But anyone paying that much for a pdf wants security in knowing that their pdf won’t circulate easily, especially since the manuals contained poker strategies; like limited print runs, the pdfs lose value the more widely they are held. If the pdf manual got out, an author could trace the particular variation back to the original buyer and enforce the sales contract with social consequences in the poker community.

Another tactic to make a digital work scarce is to keep it out of digital networks altogether. This was Cilvaringz and RZA’s tactic with the Wu-Tang Clan’s 2015 album Once Upon a Time in Shaolin. Just one copy of the album was pressed, put in a bejeweled silver box with leather-bound liner notes, then put up for auction, where it fetched $2 million. The book accompanying the album says specifically: “This book has not been catalogued with the Library of Congress." The scarcity was ensured through the singular pressing as well as an agreement forbidding the buyer to exploit it commercially for 88 years (though it didn't curtail free distribution). The buyer later turned out to be the infamous Martin “Pharmabro” Shkreli, the album was seized by the US government in trial, and Shkreli finally streamed it in celebration of Trump’s victory in 2016.

With Once Upon a Time in Shaolin, the Wu-Tang Clan recaptured the aura of original art in digital music. Cilvaringz and RZA were frustrated at the devaluing of music through pirating and streaming and sought to make Once Upon a Time in Shaolin an art object. In an interview with Rolling Stone, RZA said, "It's kind of crazy. The record has become an entity, very different from a lot of albums. It's like the Mona Lisa. It's got its own folklore, and that's what me and [co-producer] Cilvaringz wanted." Speaking of digitized music, RZA said, “OK, nobody don't see the value on it, and we gonna put a value on it. We wanna say, 'This is what we think it's worth'" (Grow). On the album’s website, they channel Benjamin’s description of the aura in the age of mechanical reproduction:

Mass replication has fundamentally changed the way we view a recorded piece of music, while digital universality and vanishing physicality have broken our emotional bond with a piece of music as an artwork and a deeply personal treasure. […] We hope to inspire and intensify urgent debate about the future of music, both economically and in how our generation experiences it. We hope to steer those debates toward more radical solutions and provoke questions about the value and perception of music as a work of art in today’s world.

While a singular, high-profile release of an album might not be a general solution to the aura problem of digital art, it certainly worked for the Wu-Tang Clan.

Once upon a time
Figure 4: Screenshot from the official website of Once Upon a Time in Shaolin, http://scluzay.com/.

Enter the Blockchain

Blockchain technology enables new approaches to artificial scarcity for digital work. The protocol for the blockchain was specified with the release of a white paper describing the Bitcoin currency by Satoshi Nakamoto (a pseudonym) in 2008, at the height of the financial crisis. The protocol Nakamoto described took care of several problems with cashless digital transactions, including authorization, privacy, and double-spending. Prior to the Nakamoto white paper, it was only possible to check these boxes with the help of a trusted financial institution. Another problem with previous attempts at digital currency was uptake, or literal buy-in. In 2008, with trust in these institutions at a nadir, Bitcoin was a revelation (Brunton). People were ready to trust a new computational protocol.

Blockchain is essentially a ledger of transactions, with each transaction occupying a unique position on the ledger’s chain of records. The ledger is recorded not centrally, in a bank’s records, but decentrally, on the computers of the participants. The ledger of record is determined by consensus and influenced by who carries the record of the longest chain of transactions. So it is not possible for an interloper to drop in and change the consensus ledger unless they somehow control 51% of the participating recorders.

Transactions are grouped into blocks to be verified by participants, who must crack a complex computational problem to verify the block. This process is called “proof of work” because the problem requires a huge amount of computational brute force to solve. Participants, called “miners” because they are mining for computational solutions, are incentivized to verify blocks by the chance to earn cryptocurrency if they are the first to crack the problem. The enticing incentive to verify transactions, along with the huge expenditure of resources required to solve the problem—which is intentionally and arbitrarily difficult—is why cryptocurrency is so environmentally destructive. Computation requires energy, and millions of competing processors dedicated to solving an intentionally difficult problem add up to a lot of energy expenditure.
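
A minimal sketch of the proof-of-work idea, assuming a toy difficulty (four leading zero hex digits) far below any real network’s target: the miner tries nonce after nonce until a hash falls below the threshold, which is expensive to find but takes a single hash to check.

```python
import hashlib
from itertools import count

def proof_of_work(block_data: str, difficulty: int = 4) -> tuple[int, str]:
    """Brute-force a nonce whose SHA-256 digest starts with `difficulty` zero hex digits.
    Real networks demand far more leading zeros, hence the enormous energy cost."""
    target = "0" * difficulty
    for nonce in count():
        digest = hashlib.sha256(f"{block_data}:{nonce}".encode()).hexdigest()
        if digest.startswith(target):
            return nonce, digest

nonce, digest = proof_of_work("block 1: artist transfers token 1 to auction winner")
print(nonce, digest)   # finding the nonce took thousands of hashes; checking it takes one
```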

To combat the wasted-energy problem, “proof of stake” is a newer alternative to “proof of work” for block verification. Proof of stake effectively lets people bet their assets on their verification. Someone would need to control 51% of the cryptocurrency at stake in order to defraud the ledger. For a cryptocurrency like Algorand, which uses proof of stake, the random and secret selection of block certifiers makes it especially difficult to corrupt enough users to defraud the ledger. Proof of stake routes around brute-force computation and thus some of the environmental destruction of blockchain. Rather than favoring the biggest processors, it favors the biggest accounts.
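
A correspondingly minimal sketch of stake-weighted selection, an illustration of the general idea rather than Algorand’s actual sortition: the chance of being chosen to certify the next block is proportional to the assets a participant has staked, so corrupting the ledger means first acquiring a majority of the stake. The names and amounts below are made up.

```python
import random

stakes = {"alice": 600_000, "bob": 250_000, "carol": 100_000, "dave": 50_000}  # hypothetical stakes

def pick_certifier(stakes: dict, rng: random.Random) -> str:
    """Choose the next block certifier with probability proportional to stake."""
    names = list(stakes)
    return rng.choices(names, weights=[stakes[n] for n in names], k=1)[0]

rng = random.Random(2021)   # a real protocol derives its randomness verifiably, not like this
picks = [pick_certifier(stakes, rng) for _ in range(10_000)]
print({name: picks.count(name) / len(picks) for name in stakes})   # roughly proportional to stake
```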

The auras of assets other than money

Any currency is just an abstract representation of value, and so after the Bitcoin white paper, it didn’t take long to figure out how to put assets other than currency on the blockchain. Through smart contracts, the Ethereum protocol enabled property and digital art to record value on the blockchain. A smart contract is a block of code that can automatically execute the terms of a contract. For instance, a smart contract can enable cryptocurrency to be exchanged in response to a triggering event like a digital file transfer and then record the transaction on a blockchain. To integrate non-digital information as an event—say, a death in the case of a will—a smart contract relies on a trusted “oracle” such as a newspaper record (or an oracle network such as Chainlink) to convert that information and trigger a distribution of assets. Many cryptocurrency protocols now include code execution, along with scripting languages and other infrastructure, to enable such smart contracts. Ethereum is the most popular of these protocols for NFTs. 
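
A smart contract is easier to picture as code. The sketch below is a hypothetical Python model, not an actual Ethereum/Solidity contract, of the two ideas just described: a registry that records which address owns which token ID (in the spirit of the ERC-721 standard), and an escrowed sale that executes only when a trusted oracle reports the triggering event. All addresses, IDs, and prices are invented.

```python
class TokenRegistry:
    """Toy model of an NFT ledger: token ID -> owning address."""
    def __init__(self):
        self.owner_of = {}

    def mint(self, token_id: int, owner: str) -> None:
        assert token_id not in self.owner_of, "token IDs must be unique"
        self.owner_of[token_id] = owner

    def transfer(self, token_id: int, sender: str, recipient: str) -> None:
        assert self.owner_of[token_id] == sender, "only the current owner can transfer"
        self.owner_of[token_id] = recipient

def escrowed_sale(registry, token_id, seller, buyer, price, oracle_confirms_payment):
    """Transfer the token only if the oracle reports that payment occurred."""
    if oracle_confirms_payment(buyer, seller, price):
        registry.transfer(token_id, seller, buyer)
        return "sale recorded"
    return "no payment event observed; nothing happens"

registry = TokenRegistry()
registry.mint(1, "0xARTIST")
outcome = escrowed_sale(registry, 1, "0xARTIST", "0xCOLLECTOR", 100,
                        oracle_confirms_payment=lambda buyer, seller, price: True)
print(outcome, registry.owner_of[1])   # sale recorded 0xCOLLECTOR
```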

Decentraland is a more recent take on a virtual world like Second Life, with real property rights in virtual spaces. But instead of the centralized control found in Second Life, Decentraland is controlled by its users through the technology of the DAO (Decentralized Autonomous Organization) made possible on the blockchain. Decentraland ties virtual land purchases to NFTs recorded on the Ethereum blockchain.

 

Cryptokitties
Figure 5: Screenshot demonstrating that Cryptokitties have varying exchange values in ETH, the Ethereum cryptocurrency, https://www.cryptokitties.co/catalogue

In 2017, the site Cryptokitties launched virtual collectible kitties that were registered as NFTs on the Ethereum blockchain. Cryptokitties emulate the artificial scarcity of baseball cards and Pokémon, but use the NFT protocol instead of physical cards to ensure that scarcity. Cryptokitties have a unique slate of traits (“Cattributes”), are released in generations, and can be bred to make more kitties. Cryptokitties are cute and silly, but they are actual assets on the blockchain and have actual (though widely varying) value in Ether (Ethereum’s cryptocurrency). The developers of Cryptokitties set out with a goal of introducing new users to cryptocurrency. Cryptokitties are, then, cute missionaries for a new financial order.

Cryptokitties have as much value as any collectible item like Beanie Babies and Pokémon cards. The developers, who refer to Cryptokitties as a kind of game, write,

Users spend 10-100x more on NFT assets than typical “in-game” digital assets because NFTs guarantee authenticity, scarcity, durability, and true ownership, which means NFTs have something very few digital components currently have: tradable value outside the ecosystem in which they were created. As an owner, I can sell my NFT assets at any given time – or I can keep them forever, passing them on like a family heirloom, from one generation to the next.

Gen Z+1 is certainly looking forward to receiving their cedar chest/Ethereum wallets of Diamond-gene Kitties.

NBA’s Top Shot is directly emulating the marketplace of sports cards through NFTs of “Moments” collected into Packs. Top Shot Moments give you the opportunity to “own your fandom.” The moments are “limited edition guaranteed by the blockchain” and are classified as Common, Rare and Legendary. In March 2021, Packs vary from $9 (Common) to $999 (Legendary). Two million Moments have been sold in the Marketplace, and the Legendary Steph Curry Dec 15, 2019 jumpshot, NY v. Cali, #1/4000 edition, is currently priced at $3,333. Anyone could find a clip of this jump shot on YouTube, and so the NBA has designed the Top Shot Moments to be more sophisticated than clips—they’re slick virtual cubes displaying stats along with video. But what distinguishes Top Shots from video clips is the artificial scarcity of exclusive ownership "guaranteed by the blockchain."

TopShot Moments
Figure 6: Screenshot from NBA TopShot, showing the TopShot Moments virtual cubes, https://nbatopshot.com/about

Cryptokitties and Top Shot Moments are infusing value into bits by making them scarce and capitalizing on the human propensity to hoard scarce goods. Artist Kevin Abosch calls NFTs “layers of hexadecimal code, alphanumerical proxies to distill emotional value” (Schachter). The blockchain platform has made waves in its creation of speculative capital, but with NFTs it creates emotional value as well. The blockchain registers auras.

Help someone stole my Internet tulips

The current bubble of speculation on blockchain-based assets is part of the excitement. What NFTs are worth is determined by what others think they are worth—which is the same as in the physical art market, too. And the same goes for countless other speculative bubbles, such as the Dutch tulip bubble of 1636 and the South Sea bubble of 1720, which caused Pope, Swift, and many others to lose money and prompted Daniel Defoe to call finance “Air-Money.” As Gayle Rogers details in a new book, Speculation, the risks and excitement of finance and investment have a long history. With wild speculation comes fraud and theft. As in the 2008 housing bubble, if the value of an asset keeps increasing, the incentive to verify it decreases. Buy in, cash out before the bubble bursts.

The information an NFT stores on the blockchain is effectively a link to the artwork and to metadata about the artist and provenance. The blockchain information might be immutable, but what it points to can be unstable, as Jonty Wareing recently pointed out. Artists such as Beeple and Grimes use IPFS (the InterPlanetary File System) to host this information. More secure than a URL, which generally relies on one host, IPFS allows multiple hosts for content. But with hosting distributed, no particular host has a responsibility to keep the files online (Kastrenakes). And when files go offline, there is no verification for the NFT. CheckMyNFT, a site spun up just recently in response to the NFT craze, checks whether NFTs are hosted and verified. The site has found that even high-priced NFTs by high-profile artists go offline regularly. And if the blockchain register points to an empty address, an NFT is merely an Air-Asset. Ownership of the NFT then boils down to a digital paper trail of provenance, effectively ending at: “Trust me, I own these bits.”
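
The fragility is easy to probe. The sketch below, which assumes a hypothetical content identifier (CID) and uses the public ipfs.io gateway, simply asks whether the content an NFT’s record points to still resolves, roughly the kind of check a service like CheckMyNFT performs.

```python
import urllib.error
import urllib.request

def ipfs_content_resolves(cid: str, gateway: str = "https://ipfs.io/ipfs/", timeout: int = 30) -> bool:
    """Return True if the content behind `cid` can currently be fetched through the gateway."""
    try:
        with urllib.request.urlopen(gateway + cid, timeout=timeout) as response:
            return response.status == 200
    except (urllib.error.URLError, TimeoutError):
        return False

# "QmExampleCid..." is a placeholder, not the CID of any particular artwork.
print(ipfs_content_resolves("QmExampleCidThatMayOrMayNotResolve"))
```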

An artist can theoretically "touch" multiple versions of the same piece of art—that is, mint multiple NFTs for identical pieces. That's because NFTs do not confer copyrights and there's no such thing as an "original" digital file. Since there's nothing stopping an artist from minting multiple NFTs for the same item, again, it’s trust that makes an NFT valuable. Also, without a central certification system for artists and agents, there's not much to prevent someone from fraudulently claiming to represent an artist and issuing NFTs. So, an NFT's value is contingent on whether the community of potential buyers trusts the artist, agent, and the art's authentic connection to both.

Outright theft of NFTs is also perhaps a greater risk than it is for physical assets. You just need someone’s password to their digital wallet. And as the digital security community often says, the weakest link in the chain is always the human. Passwords can be easy to crack or inadvertently shared. The security of any digital wallet depends on the security of its hosting service as well. In its auction of Everydays, Christie’s wisely specified which wallet hosting services it would accept.

 

Special
Figure 7: Screenshot of Christie's note specifying that the cryptocurrency Ether would be accepted for payment, but only as held by certain digital wallet hosts. https://onlineonly.christies.com/s/beeple-first-5000-days/beeple-b-1981-1/112924

As Michael Miraflor discovered when his NFTs were stolen on Nifty Gateway, the “trustless” virtue of the blockchain means human coordination is less necessary, but it also means that coordination and trust are unavailable when the system encounters problems like theft. Whoever has the NFTs owns the NFTs, regardless of how they got them. When his NFTs were stolen, Miraflor was able to get charges reversed on his credit card; however, he could not recover the stolen NFTs because transactions on the blockchain cannot be reversed by design. So the NFTs now belong to someone else, and the system is working as intended (@phantsy_prantz). Many commenters on Miraflor's Twitter thread were unsympathetic about the theft of virtual assets. @HeadlightsGoUp alluded to Miraflor’s participation in an NFT bubble: “Help someone stole my Internet tulips.”

And just as the owner of a physical artwork can burn it or throw it away, so can owners of NFTs. 0x_b1, a pseudonymous crypto whale with over $500 million in crypto assets, bought a piece of NFT art by Lindsay Lohan for $43K in Ether and asked a community what to do with the asset. They voted to burn it. So 0x_b1 transferred the NFT to a burn address: 0x00000. And poof, it’s gone. “NFT-dom is not all bad!” artist Kenny Schachter declared. It is nice to watch celebrities get their just deserts, although Lohan still profited from the sale.

And what about the most valuable NFT art of all, Everydays? If you would like to download your very own identical copy of it, you can here on IPFS Gateway, using the hash of the file. The hash is a mathematical output of an algorithm run on a digital file. Different digital files, even if they’re visually similar, will produce different hashes. Thus, a hash is a way of verifying a file or an exact copy of a file. If you download that exact copy of Everydays, you’ll even have the right hash. You won’t own the copyright to it, but neither does Metakovan, who paid $69 million for it. Beeple, just like any other artist selling their work, retains copyright of a work upon its sale unless otherwise stated.
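
Computing such a hash takes only a few lines. The sketch below, assuming a locally downloaded copy under a hypothetical filename, produces the SHA-256 digest of the file’s raw bytes: an exact copy yields an identical digest, while changing even a single pixel yields a completely different one. (IPFS content identifiers are built from content hashes in a similar spirit, though in their own multihash format.)

```python
import hashlib
from pathlib import Path

def file_sha256(path: str) -> str:
    """Hash the raw bytes of a file; identical bytes always give an identical digest."""
    digest = hashlib.sha256()
    with Path(path).open("rb") as handle:
        for chunk in iter(lambda: handle.read(1 << 20), b""):   # read in 1 MiB chunks
            digest.update(chunk)
    return digest.hexdigest()

print(file_sha256("everydays_copy.jpg"))   # "everydays_copy.jpg" is a hypothetical filename
```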

So, what Metakovan purchased for nearly $69 million isn’t the art or the copyright, but a place on an immutable, public transaction record—and a lot of media coverage. As Kal Raustiala and Christopher Jon Sprigman point out, Everydays is a “virtual Veblen good,” a kind of good one purchases because it is expensive. Thorstein Veblen described the phenomenon of “conspicuous consumption,” especially among the newly wealthy. The value of the good is the social status it confers. Metakovan, the purchaser of Everydays, was behind a Beeple museum in the virtual world Cryptovoxels. As art critic Ben Davis describes, the museum isn’t a particularly good place to show Beeple’s artwork—it is instead a platform for buying B.20 tokens, or partial shares in the Beeple collection. Beeple’s NFT artwork, purchased by crypto-speculators, serves, then, as a portfolio for further crypto speculation. We can all get in on the virtual ground floor.

NFTs rely on a chain of social valuation tied to digital scarcity, and that scarcity contributes to value. But the supposed scarcity of NFTs relies on a chain of humans, massive computational expenditure, file hosts, trust in digital registers, and speculation on social value. Which means their value is dependent on people, art, finance, and digital security all at once. No wonder many call NFTs a “house of cards.”

NFTs
Figure 8: “NFTs are dumb. Please go outside, do drugs, & have sex like normal” Wilmington, DE, April 2021. Photo by Karl Stomberg. Art by @vivideman https://twitter.com/KFosterStomberg/status/1388209346648084483/photo/1. Used with permission. 

NFTs as a novel nexus of art and value

Like Bitcoin, NFTs rose to prominence in a crisis, and both have been a reflection of and a response to the specific nature of that crisis. As during the banking crisis of 2008, the ground during the Covid-19 pandemic of 2020-21 is primed for the radical rethinking of value that RZA and Cilvaringz asked for. We are all jolted out of our routines, physical trading and transport of goods is limited, cryptocurrency has gained a foothold in mainstream finance, and we are bolted to our computers with only virtual materials as a means of creation. Out of this context, NFTs evolved as a novel nexus of art and value.

But while this particular nexus might be new, attempts at value and scarcity on new technological platforms are not. Manufacturing, film, and photography prompted Benjamin to consider how artificial scarcity contributed to the aura of art. That we now instill emotional value in digital art, blockchain registers, an InterPlanetary File System, and non-fungible tokens is another reminder that technology echoes humanity. We have created scarce, rivalrous, positional goods even when replication and scale are key affordances of computation.

Acknowledgments: Thank you to Alexandria Lockett for inspiring this piece by pointing me to the Beeple sale and asking my thoughts. I am also grateful to Tim Laquintano, Alison Langmead, and Gayle Rogers for feedback on the draft in process and to Steve Vee, @vivideman, and Karl Stomberg for letting me feature their art. Thanks also to Jeffrey Yost and Amanda Wick for launching an open access platform for this kind of work, as well as helpful comments on the draft.


Bibliography

@beeple. “holy fuck.” Twitter, 11 Mar. 2021, 10:42 a.m., https://twitter.com/beeple/status/1370037462085595137.

Benjamin, Walter. (1968). “The Work of Art in the Age of Mechanical Reproduction.” Illuminations, edited by Hannah Arendt, Fontana, pp. 214–18.

Boucher, Brian. (2021). “Twitter Founder Jack Dorsey Is Auctioning Off the World’s First Ever Tweet as an NFT—and the High Bid Is Already $2.5 Million.” Artnet, March 2021. https://news.artnet.com/market/jack-dorsey-nft-tweet-1950279.

Brunton, Finn. (2019). Digital Cash: The Unknown History of the Anarchists, Utopians, and Technologists Who Created Cryptocurrency. Princeton University Press.

“Check My NFT.” Check My NFT, https://checkmynft.com/. Accessed 9 Apr. 2021.

“Cryptokitties.” Cryptokitties, https://www.cryptokitties.co/. Accessed 9 Apr. 2021.

Davis, Ben. (2021). “I Visited the Digital Beeple Art Museum and All I Got Was an Aggressive Pitch for My Money.” Artnet, March 25, 2021. https://news.artnet.com/opinion/beeple-b-20-museum-review-1954174.

“Decentraland.” Decentraland, https://decentraland.org/. Accessed 9 Apr. 2021.

Fleming, Nic. (2009). “Virtual World Theft Heads to Real Life Court.” Computer Weekly, https://www.computerweekly.com/news/1280090966/Virtual-world-theft-heads-to-real-life-court.

Graham, Robert. (March 20, 2021). “Deconstructing that $69million NFT.” Security Boulevard, https://securityboulevard.com/2021/03/deconstructing-that-69million-nft/.

Grow, Kory. (2018). “RZA Wanted to Buy Martin Shkreli’s Wu-Tang Album Back for Himself.” Rolling Stone, https://www.rollingstone.com/music/features/rza-talks-martin-shkreli-shaolin-wu-tang-album-w518574.

@HeadlightsGoUp. “Help someone stole my internet tulips.” Twitter, 15 March 2021, 11:05 a.m., https://twitter.com/HeadlightsGoUp/status/1371477584781979650.

@jonty. “Out of curiosity I dug into how NFT's actually reference the media you're "buying" and my eyebrows are now orbiting the moon.” Twitter, 17 March 2021, 8:30 a.m. https://twitter.com/jonty/status/1372163423446917122

@kennyschac. “Someone bought #lindsaylohan nft and destroyed it. C’mon you must admit NFTdom is not all bad!” Twitter, 11 Mar 2021, 8:40 a.m. https://twitter.com/kennyschac/status/1370006659096064007.

Kastrenakes, Jacob. (2021). “Your Million Dollar NFT Can Break Tomorrow If You’re Not Careful.” The Verge, March 2021. https://www.theverge.com/2021/3/25/22349242/nft-metadata-explained-art-crypto-urls-links-ipfs.

Langmead, Alison, et al. (2021). “Leonardo, Morelli, and the Computational Mirror.” Digital Humanities Quarterly, vol. 15, no. 1, http://www.digitalhumanities.org/dhq/vol/15/1/000540/000540.html.

Laquintano, Timothy. (2016). Mass Authorship and the Rise of Self-Publishing. University of Iowa Press.

@michaelmiraflor. “Someone stole my NFTs today on @niftygateway and purchased $10K++ worth of today's drop without my knowledge. NFTs were then transferred to another account. I encourage EVERYONE to please check their accounts ASAP. Could use everyone's help here please RT!” Twitter, 14 March 2021, 4:39 p.m., https://twitter.com/michaelmiraflor/status/1371199359996456960.    

“NBA Top Shot.” NBA Top Shot, https://nbatopshot.com/. Accessed 9 Apr. 2021.

“Online Auction 20447.” (March 11, 2021). Christie’s, https://onlineonly.christies.com/s/beeple-first-5000-days/beeple-b-1981-1/112924.

@phantsy_prantz. “it's literally impossible by design you clearly don't understand the technology very well by the terms of ‘ownership’ of an NFT, the new owner is the only owner of those tokens, regardless of anyone's feelings this is the system working as intended.” 17 Mar., 2021, 1:17 p.m., https://twitter.com/phantsy_prantz/status/1371510963405529090.

Rogers, Gayle (2021). Speculation: A Cultural History from Aristotle to AI. Columbia University Press.

Raustiala, Kal and Christopher Jon Sprigman (2021). “The One Redeeming Quality of NFTs Might Not Even Exist.” Slate, April 14, 2021. https://slate.com/technology/2021/04/nfts-digital-art-authenticity-problem.html.

Schachter, Kenny. (2021). “Are NFTs the Next Tulip Bubble? Kenny Schachter Doesn’t Care and He Sold His Own Grandma on the Crypto Web to Prove It.” Artnet, Mar. 2021. https://news.artnet.com/opinion/kenny-schachter-on-nfts-continued-1950407.

“Second Life.” Second Life, https://secondlife.com/. Accessed 9 Apr. 2021.

 

Vee, Annette. (June 2021). “NFTs, Digital Scarcity, and the Computational Aura.” Interfaces: Essays and Reviews in Computing and Culture Vol. 2, Charles Babbage Institute, University of Minnesota, 38-54.


About the author:

Annette Vee is Associate Professor of English and Director of the Composition Program at the University of Pittsburgh, where she teaches undergraduate and graduate courses in writing, digital composition, materiality, and literacy. Her teaching, research, and service all dwell at the intersections between computation and writing. She is the author of Coding Literacy (MIT Press, 2017), which demonstrates how the theoretical tools of literacy can help us understand computer programming in its historical, social, and conceptual contexts.


 

Everyday Information Studies: The Case of Deciding Where to Live

Melissa G. Ocepek and William Aspray

Abstract: This essay introduces everyday information studies to historians of computing. This topic falls within the subdiscipline of information behavior, one of the main subject areas in information studies. We use our recent edited book, Deciding Where to Live (Rowman & Littlefield, 2021), as a means to illustrate the kinds of topics addressed and methods used in everyday information studies. We also point the reader to some other leading examples of scholarship in this field and to two books that present an overview of the study of information behavior.

(PDF version available for download.)

This essay introduces everyday information studies to historians of computing. The story of this field of study and its history are too large to tell in detail here. This topic falls within the subdiscipline of information behavior, one of the main subject areas in information studies – a field that began to be studied between the two world wars and took off in the 1960s. The reader interested in information behavior studies more generally should examine two well-regarded reference works on this subject (Case and Given 2016; Fisher, Erdelez, and McKechnie 2005).

Information Study Approaches

The early research on information behavior focused on human behavior in structured information environments, such as when a person went to a library to seek information or interacted with a database. But, of course, there were other, less structured environments for finding information, such as through conversations with friends and family; consulting religious or civic leaders, or area specialists such as financial advisors; and through consumption of the media. With the coming of the Internet and portable information devices, one could seek information anywhere, anytime, on any subject profound or frivolous. Information seeking, consumption, and analysis became an increasingly everyday part of ordinary people’s lives. The field expanded over time to include not only information needs, wants, and seeking, but also information avoidance and overload, and various kinds of affective as well as cognitive responses to information.

In fact, the everyday aspects of information were studied not only by information scholars but also by sociologists, communications scholars, and media scholars beginning as early as the 1930s. These studies about the roles information plays in one’s everyday life draw upon theorizing by such scholars as Michel de Certeau (1984), Henri Lefebvre (2008/1947), Dorothy E. Smith (1987), and Carolyn Steedman (1987). For an overview of the relevant theorizing, see Highmore (2001), Bakardjieva (2005, Chs. 1 and 2), and Haythornthwaite and Wellman (2002). Highmore also includes writing selections from many of these theorists. To make this introduction to everyday information studies more manageable, we focus here on our own work and primarily on our recent edited book, Deciding Where to Live (Ocepek and Aspray 2021). For a sample of other everyday information studies, see for example the work of Denise Agosto (with Sandra Hughes-Hassell, 2005), Karen Fisher (née Pettigrew, 1999), Tim Gorichanaz (2020), Jenna Hartel (2003), Pam McKenzie (2003), and Reijo Savolainen (2008).

Our personal involvement with research on everyday information studies began with Everyday Information (Aspray and Hayes 2011), which injected historical scholarship into studies on everyday information. In a long study of “100 Years of Car Buying”, one of us (Aspray, pp. 9 – 70 in Aspray and Hayes 2011) introduced a historical model, showing how endogenous forces (e.g., the dealership model for selling automobiles, or the introduction of foreign automobiles into the American market) and exogenous forces (e.g., war, or women entering the workforce) shaped the information questions that people were interested in and sometimes even the information sources they consulted. This volume, presenting an historical approach to everyday information behavior, included contributions by the noted historians of computing James Cortada, Nathan Ensmenger, and Jeffrey Yost.

Our collaboration began when the two of us, together with our colleague George Royer (today a game designer in Austin, TX), wrote two books about food from the perspective of information studies. We did not follow the typical approaches of food scholars, studying such topics as food pathways or food security, but instead applied the lens of information studies to this topic of wide popular interest. In the two short books that we produced, Food in the Internet Age (Aspray, Royer, Ocepek 2013) and Formal and Informal Approaches to Food Policy (Aspray, Royer, and Ocepek 2014), we discussed a wide variety of topics, such as: the online grocer Webvan (the largest loser of venture capital in the dot-com crash of 2001); the harms that Yelp, OpenTable, and Groupon created for small brick-and-mortar businesses and customers; the different ways in which the Internet has been used to represent and comment upon food and food organizations; the regulation of advertising of sweetened cereals to children; and the strategies of informal, bully-pulpit persuasion compared to formal regulation of food and nutrition – carried out through a pair of studies: one of Eleanor and Franklin Roosevelt, and the other of Michelle and Barack Obama.

This work on food, and some of our subsequent research, falls into the field of information studies. We carefully use that term instead of information science because our work is more informed by humanities (critical theory, cultural studies) and social science disciplines (sociology, psychology, organizational and management studies, work and labor studies) than by computer science, natural science, and engineering disciplines. We both have worked in information schools, part of a movement toward the interdisciplinary study of computing and information that has emerged in the past quarter century out of (1) library schools becoming more technical, (2) computer science departments becoming more interested in people and human institutions and their social impact, and (3) newly created interdisciplinary enterprises. These information schools offer a big tent for many different kinds of methods, theories, and approaches. The breadth of these studies can be seen in the wide range of papers delivered at the annual meeting of ASIST (for example, https://www.conftool.org/asist2020/index.php?page=browseSessions&path=adminSessions) and the annual "iConference" (https://ischools.org/Program). Also see the types of scholarship presented at the specialty biennial conference on "Information Seeking in Context" (ISIC, e.g., http://www.isic2018.com).

So far, there is little cross-usage of methods or approaches by people studying everyday information (e.g., a traditional information studies scholar who studies information literacy incorporating research from data science or ubiquitous computing), but this cross-fertilization is just beginning to happen. In our own research, we do the next best thing through edited volumes whose chapters use a variety of approaches, so as to gain multiple perspectives on an issue. This is true, for example, in our book on where to live (discussed in detail below) and the book on information issues in aging (mentioned below).

Figure 1: The cover of the Deciding Where to Live: Information Studies on Where to Live in America
Edited by Melissa G. Ocepek and William Aspray.


Deciding Where to Live

In our recent edited book, Deciding Where to Live, we are continuing our study of everyday phenomena through an information lens. We describe this book in some detail here to give our readers a better sense of the ways in which information studies scholars operate. All of the chapters in this book were written by people associated with leading information schools in the United States (Colorado, Illinois, Indiana, Syracuse, Texas, Washington). As with our food studies, we have taken multiple perspectives – all drawn from information studies – to investigate various aspects of housing. These studies, for example, employ work studies and business history; information, culture, and affective aspects of information issues; community studies; information behavior; and privacy.

Information scholars are often interested in the results of scholarship by labor, management, and organization scholars; and sometimes they adopt their theories and methods. These scholars are interested in such issues as the growing number of information occupations, the increased percentage of a person’s job tasks devoted to information activities, and the ways in which tools of communication and information have changed firm strategies and industry structures. Everyday information scholars, too, are interested in these results, but primarily for what they have to say about the everyday or work lives of individuals.

The work of real estate firms, realtors, and home buyers and sellers has been profoundly changed by the massive adoption of information and communication technologies in recent years. Let us consider two chapters, by James Cortada and Steve Sawyer, from the Deciding Where to Live book. One major change in the 21st century has been the rise of websites, such as Zillow and Realtor.com, that enable individuals to access detailed information about housing without having to rely upon a realtor or the Multiple Listing Service. Using a business history approach, Cortada shows how these technologies have changed the structure of the real estate industry, altered the behavior of individual firms, made the buyer and seller more informed shoppers, lowered commissions on house sales, and introduced new business models such as Zillow buying homes themselves and not just providing information about them. Some people believe that the rise of companies such as Zillow means that the imbalance between the information held by realtors and buyers is largely a thing of the past, that the intermediation realtors once provided is largely over, and that the need for realtors is greatly diminished – and that we will see a radical shrinking in this occupation in the same way that the numbers of telephone operators and travel agents have plummeted. (See Yost 2008.)

Sawyer argues, however, that the work of the real estate agent is evolving rather than being eliminated. As he states his argument: “real estate agents have been able to maintain, if not further secure, their role as market intermediaries because they have shifted their attention from being information custodians to being information brokers: from providing access to explaining.” (Sawyer 2021, p. 35) As he notes, the buying of a house is a complex process, involving many different steps and many different participants (selecting the neighborhood and the particular house, inspecting the property, checking on title and transferring it, obtaining financing, remediating physical deficiencies in the property, etc.). One might say that it takes a village to sell a house in that village; and an important role of the real estate agent is to inform the buyers of the many steps in the process and use their network of specialists to help the buyers to carry out each step in a professional, timely, and cost-effective way.

Figure 2: A Zillow search page for Arvada, CO captured on April 4th, 2021.

How do these changes affect the everyday life of the individual? There are more than 2 million active real estate agents in the United States. Their work has changed profoundly as they adopt real-estate-oriented websites and apps. Even though most real estate agents work through local real estate firms, to a large degree they act as independent, small businesspeople who carry out much of their work from their cars and homes as much as from their offices. So, they rely on websites and apps not only for information about individual homes, but also for lead generation, comparative market analysis, customer relationship management, tracking business expenses such as mileage, access to virtual keys, video editing of listings, mounting marketing campaigns, and a multitude of other business functions. Buyers and sellers, for their part, can use Zillow or its online competitors to become informed before ever meeting with a real estate agent: learning how much their current home is worth, figuring out how large a mortgage they can qualify for, checking out multiple potential neighborhoods not only for housing prices but also for quality of schools and crime rates, checking out photos and details of numerous candidate houses, and estimating the total cost of home ownership. Interestingly, many individuals who are not looking to buy or sell a home in the near term are regular users of Zillow. It is a way to spy on neighbors, try out possible selves, plan for one’s future, or just have a good time. In our introductory chapter, we address these issues.

Another chapter, by Philip Doty, reflects upon the American dream of the smart home. Drawing upon scholarship on surveillance capitalism (Shoshana Zuboff 2019), feminist scholarship on privacy (Anita Allen 1988; Patricia Boling 1996; Catharine MacKinnon 1987), gender studies in the history of science and technology (Ruth Cowan 1983), the geography of surveillance (Lisa Makinen 2016), and other scholarly approaches, Doty examines the rhetorical claims of technological enthusiasm related to smart cities and smart homes, and discusses some of the privacy and, in particular, surveillance issues that arise in smart homes.

Information is not merely used by people in cognitive ways; it can also bring joy, sadness, anxiety, and an array of other emotions. Deciding where to live can be an exciting, fraught, and stressful experience for many people. When one is searching for a home in a particularly competitive housing market, the addition of time pressures can amp up the emotional toll of house hunting and discourage even the most excited home buyer. In her chapter, Carol Landry recounts how the high-stakes decision making of home buying becomes even more complicated when time pressure and emotions come into play. Her research is based on an empirical study of home buyers in the highly competitive Seattle real estate market. The chapter describes the experience of several home buyers dealing with bidding wars that required quick decision making and many failed attempts at securing a home. The stories shared in this chapter highlight the despair and heartbreak that made continuing the home search difficult for participants, whom she describes as going from enthusiastic information seekers to worn-out information avoiders. This chapter shows how internal and external factors can impact the home buying process and the information behaviors associated with it.

A competitive real estate market is but one of myriad experiences that can further complicate the process of deciding where to live. There are times in most people’s lives where the unique attributes of a life stage play an outsized role in decision-making around housing; one of these times is retirement. In Aspray’s chapter, the realities of retirement complicate the lives of individuals lucky enough to be able to retire, introducing new considerations that shape decision making. Retirement adds new complexity to deciding where to live because the stability of work that binds many people’s lives is no longer there, creating many exciting new opportunities and constraints. Different elements shape questions around where to live for retired people, including the emotional ties to their current homes, the financial realities of retirement income, and the physical limitations of aging.

Figure 3: The cover of HGTV Magazine from January/February 2016.

During times of societal uncertainty, a home can be a comforting shelter that keeps the external world at bay, even when much of that uncertainty stems from the housing market itself, as it did during the Housing Crisis of 2007 and the recession that followed. As more and more people lost their homes to foreclosures or struggled to pay their mortgages, home and garden entertainment media provided a pleasant, comfortable escape for millions of Americans. Ocepek, in her chapter on home and garden sources, found that, throughout the housing crisis, recession, and recovery, home and garden sources grew or maintained their popularity with viewers and readers – likely due to the social phenomenon of cocooning, or taking shelter in one’s space when the world outside becomes uncertain and scary. Both home and garden magazines and HGTV made changes to their content to represent the new home realities of many of their readers and viewers, but they also largely stayed the same, presenting comforting content about making whatever space you call home the most comfortable.

The financial hardships throughout the housing crisis, recession, and recovery were not experienced by all Americans in equal measure. Several authors in the book present multiple examples where housing policies, economic conditions, and social unrest disproportionately affected marginalized communities throughout the United States. One is Pintar’s chapter about Milwaukee, mentioned below. Although some of the legal frameworks built to segregate cities and communities throughout the country have changed, the experience of deciding where to live for Black and African Americans adds additional layers of complexity to the already complicated process. Drawing on critical race theory, Jamillah Gabriel delineates how Black and African American house searchers (renters and buyers) create information seeking and search strategies to overcome the historic and contemporary discriminatory policies and practices of housing segregation. The chapter analyzes specialized information sources that provide useful information to help this group of house searchers find safer communities where they have the greatest chance to prosper. These sources include lists of the best and worst places for African American and Black individuals and families to live. The lists draw on research that compares communities based on schools, employment, entertainment, cost of living, housing market, quality of life, and diversity. Drawing on historic and contemporary accounts, the analysis provided in this chapter highlights that “the housing industry can be a field of land mines for African American in search of home” (Gabriel 2021, p. 274).

Figure 4: Home Owner’s Loan Corporation Map of Milwaukee, Wisconsin, 1938, National Archives; image retrieved from Mapping Inequality, University of Richmond, https://dsl.richmond.edu/panorama/redlining/#loc=11/43.03/-88.196&city=milwaukee-co.-wi

It is often said that information and information tools are neither inherently good nor bad, but that they can be used for both good and bad purposes. Two chapters in the book illustrate this point. In a study of the city of Milwaukee, Judith Pintar shows how HOLC maps, which were created to assess the stability of neighborhoods, were used to reinforce the racist practice of redlining. In another chapter, by Hannah Weber, Vaughan Nagy, Janghee Cho, and William Aspray, the authors show how information tools were used by the city of Arvada, Colorado and various groups (such as builders, realtors, parents, activists, and the town council) to improve the city’s quality of life in the face of rapid growth and its attendant issues such as traffic problems, rising housing prices, the need to build on polluted land, and the desire to protect the traditional look and feel of this small town. A third chapter, by David Hopping, shows how an experiment in Illinois was able to repurpose military housing for non-military purposes for the social good. His empirical study is seen through the lens of the theoretical constructs of heterotopia (Foucault 1970), boundary objects (Star and Griesemer 1989), and pattern language (Alexander 1977).

Conclusions

Both of us are continuing to pursue work on everyday information issues. One of us (Aspray) is continuing this work on information studies in everyday life through an edited book currently in progress on information issues related to older Americans (Aspray, forthcoming in 2022). This book ranges from traditional Library and Information Science approaches (health information literacy around insurance for older Americans, the variety of information provided by AARP and its competitors, and the use of information and communication technologies to improve life in elderly communities) to more technologically oriented studies on ubiquitous computing, human-computer interaction, and the Internet of Things for older people. Meanwhile, Ocepek is building on her doctoral dissertation (Ocepek 2016), which examined the everyday activity of grocery shopping from both social science and cultural approaches. Her new study examines what has happened to grocery shopping during the pandemic.

We are pleased to see the broadening in mission of the Babbage Institute to consider not only the history of computing but also the history and cultural study of information. For example, many scholars (including some computer historians) since 2016 have been studying misinformation. (See, for example, Cortada and Aspray 2019; Aspray and Cortada 2019.) This study of everyday information is another way in which the Babbage Institute can carry out its broadened mission today.

In particular, there are a few lessons for computer historians that can be drawn from the scholarship we have discussed here, although many readers of this journal may already be familiar with and practicing them:

  • One can study information as well as information technology. On the history of information, see for example Blair (2010), Headrick (2000), Cortada (2016), and Ann Blair et al. (2021). For a review of this scholarship, see Aspray (2015).
  • One can study everyday uses of information and information technology, even if they may be regarded by some as quotidian – expensive, complex, socially critical systems are not the only kinds of topics involving information technology that are worth studying.
  • This past year has taught all of us how an exogenous force, the COVID-19 pandemic, can quickly and radically reshape our everyday lives. In the opening chapter of our book, we briefly discuss the earliest changes the pandemic brought to real estate. We are also seeing the grocery industry as well as the millions of consumers learning, adapting, and changing their information behaviors around safely acquiring food.
  • In order to study both historical and contemporary issues about information and information technology, one can blend historical methods with other methods from computer science (e.g., human-computer interaction, data science), social science (qualitative and quantitative approaches from sociology, psychology, economics, and geography), applied social science (labor studies, management and organization studies), and the humanities disciplines (cultural studies, critical theory).

These are exciting times for the historians of computing and information!


Bibliography

Agosto, Denise E. and Sandra Hughes-Hassell. (2005). "People, places, and Questions: An Investigation of the Everyday Life Information-Seeking Behaviors of Urban Young Adults." Library & Information Science Research, vol. 27, no. 2, pp. 141-163.

Alexander, Christopher. (1977). A Pattern Language. Oxford University Press.

Allen, Anita L. (1988). Uneasy Access: Privacy for Women in a Free Society. Rowman & Littlefield.

Aspray, William. (2015). "The Many Histories of Information." Information & Culture, vol. 50, no. 1, pp. 1-23.

Aspray, William. (forthcoming 2022). Information Issues for Older Americans. Rowman & Littlefield.

Aspray, William and James Cortada. (2019). From Urban Legends to Political Fact-Checking. Springer.

Aspray, William and Barbara M. Hayes. (2011). Everyday Information. MIT Press.

Aspray, William, George W. Royer, and Melissa G. Ocepek. (2013). Food in the Internet Age. Springer. 

Aspray, William, George W. Royer, and Melissa G. Ocepek. (2014). Formal and Informal Approaches to Food Policy. Springer.

Bakardjieva, Maria. (2005). Internet Society: The Internet in Everyday Life. Sage.

Blair, Ann. (2010). Too Much to Know. Yale.

Blair, Ann, Paul Duguid, Anja Silvia-Goeing, and Anthony Grafton, eds. (2021). Information: A Historical Companion. Princeton.

Boling, Patricia. (1996). Privacy and the Politics of Intimate Life. Cornell University Press.

Case, Donald O. and Lisa M. Given. (2016). Looking for Information. 4th ed. Emerald.

Cortada, James and William Aspray. (2019). Fake News Nation. Rowman & Littlefield.

Cowan, Ruth Schwartz. (1983). More Work for Mother. Basic Books.

De Certeau, Michel (1984). The Practice of Everyday Life. Translated by Steven F. Rendall. University of California Press. 

Fisher, Karen E., Sandra Erdelez, and Lynne McKechnie. (2005). Theories of Information Behavior. Information Today.

Foucault, Michel. (1970). The Order of Things. Routledge.

Gorichanaz, Tim (2020). Information Experience in Theory and Design. Emerald Publishing.

Hartel, Jenna. (2003). "The Serious Leisure Frontier in Library and Information Science: Hobby Domains." Knowledge Organization, vol. 30, No. 3-4, pp. 228-238.

Haythornthwaite, Caroline and Barry Wellman, eds. (2002). The Internet in Everyday Life. Wiley-Blackwell.

Headrick, Daniel. (2000). When Information Came of Age. Oxford.

Highmore, Ben ed. (2001). The Everyday Life Reader. Routledge.

Lefebvre, Henri. (2008). Critique of Everyday Life. vol. 1, 2nd ed. Translated by John Moore. Verso.

MacKinnon, Catharine. (1987). Feminism Unmodified. Harvard University Press.

Makinen, Lisa A. (2016). "Surveillance On/Off: Examining Home Surveillance Systems from the User’s Perspective." Surveillance & Society, 14.

McKenzie, Pamela J. (2003). "A Model of Information Practices in Accounts of Everyday‐Life Information Seeking." Journal of Documentation, vol. 59, no. 1, pp. 19-40.

Pettigrew, Karen E. (1999). "Waiting for Chiropody: Contextual Results from an Ethnographic Study of the Information Behaviour Among Attendees at Community Clinics." Information Processing & Management. vol. 35, no. 6, pp. 801-817.

Ocepek, Melissa G. (2016). "Everyday Shopping: An Exploration of the Information Behaviors of the Grocery Shoppers." Ph.D. Dissertation, School of Information, University of Texas at Austin.

Ocepek, Melissa G. and William Aspray, eds. (2021). Deciding Where to Live. Rowman & Littlefield.

Savolainen, Reijo. (2008). Everyday Information Practices: A Social Phenomenological Perspective. Scarecrow Press. 

Smith, Dorothy E. (1987). The Everyday World as Problematic: A Feminist Sociology. Northeastern University Press. 

Star, Susan Leigh and James R. Griesemer. (1989). "Institutional Ecology, Translations, and Boundary Objects: Amateurs and Professionals in Berkeley’s Museum of Vertebrate Zoology, 1907-39." Social Studies of Science 19, 3: 387-420.

Steedman, Carolyn. (1987). Landscape for a Good Woman: A Story of Two Lives. Rutgers University Press. 

Yost, Jeffrey R. (2008). “Internet Challenges for Nonmedia Industries, Firms, and Workers.” pp. 315-350 in William Aspray and Paul Ceruzzi, eds., The Internet and American Business. MIT Press.

Zuboff, Shoshana. (2019). The Age of Surveillance Capitalism. Public Affairs.

 

Aspray, William and Ocepek, Melissa G. (April 2021). “Everyday Information Studies: The Case of Deciding Where to Live." Interfaces: Essays and Reviews in Computing and Culture Vol. 2, Charles Babbage Institute, University of Minnesota, 27-37.


About the authors:

Melissa G. Ocepek is an Assistant Professor at the University of Illinois Urbana-Champaign in the School of Information Sciences. Her research draws on ethnographic methods and institutional ethnography to explore how individuals use information in their everyday lives. Her research interests include everyday information behavior, critical theory, and food. Recently, she co-edited Deciding Where to Live (Rowman & Littlefield, 2021) with William Aspray. Previously she published two books that address the intersection of food, information, and culture: Food in the Internet Age and Formal and Informal Approaches to Food Policy (both with William Aspray and George Royer). Dr. Ocepek received her Ph.D. at the University of Texas at Austin in the School of Information.

William Aspray is Senior Research Fellow at CBI. He formerly taught in the information schools at Indiana, Texas, and Colorado; and served as a senior administrator at CBI, the IEEE History Center, and Computing Research Association. He is the co-editor with Melissa Ocepek of Deciding Where to Live (Rowman & Littlefield, 2021). Other recent publications include Computing and the National Science Foundation (ACM Books, 2019, with Peter Freeman and W. Richards Adrion); and Fake News Nation and From Urban Legends to Political Fact-Checking (both with James Cortada in 2019, published by Rowman & Littlefield and Springer, respectively). 


 

Of Mice and Mentalité: PARC Ways to Exploring HCI, AI, Augmentation and Symbiosis, and Categorization and Control

Jeffrey R. Yost, Charles Babbage Institute, University of Minnesota

Abstract: This think piece essay comparatively explores history and mindsets with human-computer interaction (HCI) and artificial intelligence (AI)/Machine Learning (ML). It draws on oral history, archival, and other research to reflect on the institutional, cultural, and intellectual history of HCI (especially the Card, Moran, and Newell team at Xerox PARC) and AI. It posits that the HCI mindset (focused on augmentation and human-machine symbiosis, as well as iterative maintenance) could be a useful framing to rethink dominant design and operational paradigms in AI/ML that commonly spawn, reinforce, and accelerate algorithmic biases and societal inequality.

(PDF version available for download.)

 

First Computer Mouse prototype designed and developed by Douglas Engelbart, William English, and their ARC Team at SRI. SRI International, CC BY-SA 3.0 <https://creativecommons.org/licenses/by-sa/3.0>, via Wikimedia Commons

This essay briefly recounts the 1982 professional organizational founding for the field of Human-Computer Interaction (HCI) before reflecting on two decades prior in interactive computing—HCI’s prehistory/early history—and its trajectory since. It comparatively explores history and mindsets with HCI and artificial intelligence (AI). For both HCI and AI, “knowing users” is a common target, but also a point of divergent departure.

For AI—especially large-scale, deployed systems in defense, search, and social networking—knowing users tends to involve surveillance, data collection, and analytics to categorize and control in the service of capital and power. Even when aims are purer, algorithmic biases frequently extend from societal biases. Machines can be programmed to discriminate, or they can learn it from data and data practices.

For HCI—from idealistic 1960s beginnings through 1980s professionalization and beyond—augmenting users and human-machine symbiosis have been at its core. While an HCI-type mindset offers no magic bullet for AI’s ills, this essay posits that it can be a useful framing, a reminder toward proper maintenance, stewardship, and structuring of data, design, code (software), and codes (legal, policy, and cultural). HCI systems, of course, can be ill designed, perform in unforeseen ways, or be misapplied by users, but this likely is less common, and certainly of lesser scale and impact, relative to AI. Historians and sociologists must research the vast topics of AI and HCI more fully, in many contexts and settings.

HCI and Solidifying the Spirit of Gaithersburg

In mid-March 1982, ITT Programming Technology Center’s Bill Curtis and the University of Maryland’s Ben Shneiderman held the first “Human Factors in Computing Systems” conference in Gaithersburg, Maryland. The inspiring event far exceeded the organizers’ expectations, attracting more than 900 attendees. It was the pivotal leap forward in professionalizing HCI.

Rich content filled the three-day program, while impactful organizational work occurred at a small-group side meeting one evening. At the latter, Shneiderman, Curtis, UCSD’s Don Norman, Honeywell’s Susan Dray, Northwestern’s Loraine Borman, Xerox PARC’s (Palo Alto Research Center) Stuart Card and Tom Moran, and others strategized about HCI’s future and possibilities for forming an association within a parent organization. Borman, an information retrieval specialist in a leadership role at ACM SIGSOC (Social and Behavioral Computing), and Shneiderman, a computer scientist, favored the Association for Computing Machinery (ACM). Insightfully seeing an expedient workaround, Borman proposed that SIGSOC transform—new name/mission—bypassing the need for approval of a new SIG.

The Design of Everyday Things book cover

 

Cognitive scientist Don Norman questioned whether ACM should be the home, believing computer science (CS) might dominate. After debate, Shneiderman and Borman’s idea prevailed. Dray recalls that the sentiment was “we can’t let the spirit of Gaithersburg die,” and for most, SIGSOC’s metamorphosis seemed a good strategy (Dray 2020). Borman orchestrated transforming SIGSOC into SIGCHI (Computer-Human Interaction). The CHI tail essentially became the dog (SOC’s shrinking base mainly fit under HCI’s umbrella). Interestingly, “Computer” comes first in the acronym, but likely just to achieve a pronounceable word in the ACM SIG style, as “HCI” appeared widely in early papers at CHI (SIGCHI’s annual conference).

Norman’s concerns proved prescient. SIGCHI steadily grew, reaching over 2,000 attendees by the 1990 Seattle CHI, but in its first decade it principally furthered CS research and researchers. Scholarly standards rose, acceptance rates fell, and some practitioners felt crowded out. In 1991, practitioners formed their own society, the Usability Professionals’ Association (today the User Experience Professionals Association, UXPA). In the 1990s and beyond, SIGCHI blossomed into an organization increasingly diverse in its (academic) disciplines.

As with all fields/subfields, HCI has a prehistory or an earlier less organizationally defined history (for HCI, the 1960s and 1970s). SIGCHI’s origin lay in the confluence of: past work in human factors; university “centers of excellence” in interactive computing created through 1960s Advanced Research Projects Agency (ARPA) Information Processing Techniques Office (IPTO) support; two particularly impactful laboratories (PARC and SRI’s ARC); Systems Group artists in the UK; and the promise of Graphical User Interface (GUI) personal computers (PCs).

Nonprofit corporation SRI’s Augmentation Research Center (ARC) and Xerox’s PARC were at the forefront of GUI and computer mouse developments in the 1970s and 1980s. Neither the GUI nor mouse R&D was secret at PARC; in the 1970s, many visitors saw Alto demos, including, in 1979, Steve Jobs and an Apple Computer team. In 1980 Apple hired away PARC’s Larry Tesler and others. Jobs launched the Apple Lisa effort (completed in 1983, priced at $10,000), which, like the even more expensive Xerox Star (1981), possessed a GUI and mouse. The 1984 Apple Macintosh, retailing at $2,500, initiated an early mass market for GUI personal computers—inspiring imitators, most notably Microsoft Windows 2.0 in 1987.

In early 2020, I conducted in-person oral history interviews with three of HCI’s foremost intellectual and organizational pioneers—the pilot for a continuing ACM/CBI project. This included UCSD Professor Don Norman (SIGCHI Lifetime Research Awardee; Benjamin Franklin Medalist), Xerox PARC Scientist and Stanford Professor Stuart Card (SIGCHI Lifetime Research Awardee; National Academy of Engineering), and Dr. Susan Dray (SIGCHI Lifetime Practice Awardee; UXPA Lifetime Achievement Awardee).

Don Norman is well-known both within and outside CS—extending from his 1988 book The Psychology of Everyday Things (POET), re-released as the wide-selling The Design of Everyday Things. A student of Duncan Luce (University of Pennsylvania), he was among the first doctorates in mathematical psychology. Early in his career, he joined the UCSD Psychology Department as an associate professor. After stints at Apple and Hewlett-Packard, and at Northwestern, he returned to lead the UCSD Design Laboratory. Norman helped take design from its hallowed ground of aesthetics to establish it in science, and he greatly advanced the understanding and practice of usability engineering.

HCI Scientist/Entrepreneur Susan Dray (left) and Norman's POET. 

Norman stressed to me that there is one scientist so consistently insightful that he never misses his talks at events he attends: PARC’s Stuart Card. Card was the top doctoral student of Carnegie Mellon Professor of Cognitive Psychology and Computer Science Allen Newell. While those two interviews were in California, my interview with Dr. Susan Dray was in Minneapolis, with the scientist who pioneered the first corporate usability laboratory outside the computer industry (IBM and DEC had them), at American Express Financial Advisors (AEFA).

Dray took a different path after her doctorate in psychology from UCLA: human factors work on classified Honeywell Department of Defense (DoD) projects. In the early 1980s, Honeywell, a pioneering firm in control systems, computers, and defense contracting, had a problem with ill-adapted computing for clerical staff at its headquarters, which Dray evaluated. This became path defining for her career, pointing it toward computer usability. After pioneering HCI work at Honeywell, Dray left for American Express, and later became a successful and impactful HCI consultant/entrepreneur. She applied observation, ethnographic interviewing, and the science of design to improve interaction, processes, and human-machine symbiosis in cultures globally, from the U.S., South Africa, Egypt, and Jordan to India, Panama, and France.

Earlier, in the late 1980s, at American Express, Dray was seeking funds for a usability lab, and she creatively engaged in surreptitious user feedback. She bought a “carton” of Don Norman’s POET book, had copies delivered to all AEFA senior executives on the top/29th floor, and rode up and down the elevator starting at 6 am for a couple hours each morning for weeks, listening to conversations concerning this mysteriously distributed book on the science of design. Well-informed, she pitched successfully, gaining approval for her usability lab.

The Norman, Card, and Dray oral histories; another HCI interview I recently conducted, with artist Dr. Ernest Edmonds; my prior interview with Turing Awardee Butler Lampson of Alto fame; preparation for these five interviews; and AI and HCI research at the CBI, MIT, and Stanford University archives all inform this essay.

For AI and HCI, Is There a Season?

Microsoft Research Senior Scientist Jonathan Grudin—in his valuable From Tool to Partner (2017) on HCI’s history—includes a provocative argument that HCI thrives during AI Winters and suffers during AI’s other seasons. The usefulness of the widespread Winter metaphor is debatable, since it is based on changing funding levels at elite schools (Mendon-Plasek, 2021, p. 55), but Grudin’s larger point—that only one of the two fields thrives at a time—hints at a larger truth: HCI and AI have major differences. The fields overlap, with some scientists and some common work, but they have distinct mindsets. Ironically, AI, once believed to be long on promises and short on deliveries (the rationalized basis for AI Winters), is now delivering more strongly, and likely more harmfully, than ever, given algorithmic and data biases in far-reaching corporate and government systems.

Learning How Machines Learn Bias

More and more of our devices are “smart,” a distracting euphemism obscuring how AI (in ever more interconnected sensor/IoT/cloud/analytics systems) reinforces and extends biases based on race, ethnicity, gender, sexuality, and disability. Recent interdisciplinary scholarship is exposing the roots of discriminatory code (algorithms/software) and codes (laws, policy, culture), including deeply insightful keynotes at the Charles Babbage Institute’s (CBI) “Just Code” Symposium (a major virtual event with 345 attendees in October 2020) by Stephanie Dick, Ya-Wen Lei, Kenneth Lipartito, Josh Lauer, and Theodora Dryer. Their work contributes to an important conversation, extended as well in scholarship by Ruha Benjamin, Safiya Noble, Matt Jones, Charlton McIlwain, Danielle Allen, Jennifer Light (MIT; and CBI Sr. Research Fellow), Mar Hicks, Virginia Eubanks, Lauren Klein, Catherine D’Ignazio, Amanda Menking, Aaron Mendon-Plasek (Columbia; and current CBI Tomash Fellow), and others.

AI did not merely evolve from a benevolent past to a malevolent present. Rather, it has been used for a range of different purposes at different times. Geometrically expanding the number of transistors on chips—the (partially) manufactured and manufacturing/fabrication trajectory of Moore’s Law—enabled computers and AI to become increasingly powerful and pervasive. Jennifer Light’s insightful scholarship on the RAND Corporation’s 1950s and 1960s operations research, systems engineering, and AI, created in the defense community and later misapplied to social welfare, counters notions of an early benevolent age. Even if chess is the drosophila of AI—a phrase of John McCarthy’s from the 1990s—its six-decade history is one of consequential games, of power contests. Work in computer rooms in the Pentagon’s basement and at RAND harmfully escalated Cold War policies as the DoD and its contractors simulated and supported notions of the U.S. rapidly “winning” the Vietnam War; earlier, C-E-I-R (founded by ex-RAND scientists) used input/output-economics algorithmic systems to determine optimal bomb targets to decimate the Soviet Union industrially (Yost, 2017).

What helped pull AI out of its first long (1970s) Winter were successes and momentum with expert systems—the pioneering work of Turing Awardee and Stanford AI scientist Edward Feigenbaum and molecular biologist and Nobel Laureate Joshua Lederberg on late-1960s Dendral, which advanced organic chemistry, and Feigenbaum and others’ early-1970s MYCIN, for medical diagnostics and therapeutics. These AI scientific triumphs stood out and lent momentum to expert systems, as did fears of Japan’s Fifth Generation (an early-1980s government and industry partnership in AI/systems). In the 1980s, elite US CS departments again received strong federal support for AI. Work in expert systems in science, medicine, warfare, and computer intrusion detection abounded (Yost, 2016).

Some AI systems are born biased; others learn it—from algorithmic tweaks to expert system inference engines to biased data. Algorithmic bias is just one of the many problematic byproducts of valuing innovation over maintenance (Vinsel and Russell 2020, Yost 2017).
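To make the “learn it from biased data” point concrete, consider a minimal, entirely synthetic sketch in Python (the records and neighborhood codes are hypothetical): a naive scoring “model” that simply tallies historical lending decisions reproduces whatever disparity those decisions encoded, even though no explicitly discriminatory rule appears anywhere in the code.

from collections import defaultdict

# Hypothetical historical records: (neighborhood_code, approved).
# Neighborhood "B" was historically redlined, so its approval rate is low.
history = [("A", True)] * 90 + [("A", False)] * 10 + \
          [("B", True)] * 30 + [("B", False)] * 70

# "Training": estimate P(approved | neighborhood) by counting past decisions.
counts = defaultdict(lambda: [0, 0])  # neighborhood -> [approvals, total]
for neighborhood, approved in history:
    counts[neighborhood][0] += int(approved)
    counts[neighborhood][1] += 1

def predict(neighborhood: str) -> float:
    """Score a new applicant by the historical approval rate of their neighborhood."""
    approvals, total = counts[neighborhood]
    return approvals / total

# The learned scores mirror the historical disparity: 0.9 for "A", 0.3 for "B".
print(predict("A"), predict("B"))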

Human Factors and Ergonomics

The pre-history/early history of human-machine interaction dates back many decades, to the control of workers and soldiers to maximize efficiency. The late-1950s-spawned Human Factors Engineering Society grew out of late inter-war-period organizational work of the Southern California aerospace industry. In the first half of the 20th century, human factors had meaningful roots in the scientific management thought, writings, and consulting of Frederick Winslow Taylor. This tradition defined the worker as an interchangeable part, a cog within the forces of production to efficiently serve capital. At Taylorist-inspired and organized factories, management oppressed laborers, and human factors has a mixed record in its targets, ethics, and outcomes. However, at HCI’s organizational start in the early 1980s, the mantra was not merely efficiency; it was the frequently uttered “know the user.” This, importantly, was a setting of personal computing and GUI idealism, a trajectory insightfully explored by Stanford’s Fred Turner in From Counterculture to Cyberculture.

Image of the Xerox Palo Alto Research Center (PARC) in 1977. In the 1970s and 1980s, PARC had an incredible team of some of the world’s top computer scientists. [By Dicklyon - Own work, CC BY-SA 4.0, https://commons.wikimedia.org/w/index.php?curid=95847361]

We’re on a Road to Intertwingularity, Come on Inside

Years before the National Science Foundation (NSF) took the baton to become the leading federal funder of basic CS research at universities, ARPA’s IPTO (following the vision of its 1962 founding director, J.C.R. Licklider) changed the face of computing toward interaction. Well-known philosopher and sociologist Ted Nelson, a significant HCI contributor of the 1960s and 1970s, creatively coined the term “intertwingularity” for the symbiosis of everything being intertwined or connected (networking; text, through his term/concept of “hypertext”; the human user with interactive computing). It aptly describes the multifaceted HCI work of SRI’s 1960s IPTO-funded ARC and of 1970s Xerox PARC.

The 1970-enacted Mansfield Amendment required a direct and defined DoD function for all DoD research funding. It left a federal funding vacuum for years, until NSF could ramp up to become a roughly comparable funder of the basic research in interactive computing that IPTO had started. The vacuum, however, was largely filled by a short golden age of corporate industrial research in interactive computing at Xerox, a firm with a capital war chest, much dry powder, from its past photocopier patent-based monopoly, and seeking to develop the new, new thing(s). Xerox looked to its 1970-launched PARC to invent the office of the future. It hired many previously IPTO-supported academic computer scientists and produced and employed a cadre of Turing Awardees, an unprecedented team far exceeding any single university’s CS department in talent or resources.

Inside the PARC Homeruns

Douglas Engelbart and the earliest work on the first mouse, designed by him and SRI’s Bill English, are addressed by French sociologist Thierry Bardini in Bootstrapping, a biography of Engelbart. Journalists, such as Michael Hiltzik, have covered some major contours of technical innovation at PARC.

Central to Bardini’s, Hiltzik’s, and others’ narratives is the important HCI work of Turing Awardees Douglas Engelbart at SRI and Butler Lampson, Alan Kay, Charles Thacker, and Charles Simonyi at PARC. In this essay I look beyond oft-told stories and famed historical actors in GUIs and mice to briefly discuss a hitherto largely overlooked, highly impressive small PARC research team composed of Newell, Card, and Moran, and a larger team that Card later led. The incredible accomplishments of Lampson and others changed the world with the GUI. They hit the ball out of the park, so to speak—“a shot heard round the world” (in the sense of Bobby Thomson’s 1951 Polo Grounds home run, immortalized by Don DeLillo) that very visibly revolutionized interactive computing.

Newell is one of the most famous of the first-generation AI scientists, a principal figure at John McCarthy’s famed 1956 Dartmouth Summer Workshop, at which McCarthy, Newell, Herbert Simon, Marvin Minsky, and others founded and gave a name to the field—building upon earlier work of Alan Turing. On a project launched in 1955, Newell, as lead, co-invented (with Simon and Clifford Shaw) “The Logic Theorist” in 1956, the first engineered, automated logic or AI program. Many historians and STS colleagues I have spoken with associate Newell solely with AI and are unaware of his PARC HCI work. Unlike Turing and Simon, Newell does not have a major biography documenting the full breadth of his work. Newell’s HCI research has been neglected by historians, as has that of his two top students, Card and Moran. They published many seminal HCI papers in Communications of the ACM and other top journals.

This oversight (by historians; they were revered by fellow scientists), especially the neglect of the career-long contributions of Card and Moran, is a myopic favoring of first-recognized invention over subsequent ones, missing key innovations and devaluing maintenance. It was not merely the dormouse (mouse co-inventors Engelbart and English, the recognized revolution), but multiple dormice (the science and engineering behind optimizing mice for users). Remember(ing) what the dormice said (and with an open ear of historical research), Card and Moran clearly conducted brilliant scientific research, spawning many quiet revolutions.

Xerox PARC’s and Stanford University’s Stuart Card

Rookie Card to All-Star Card, Pioneering HCI Scientist Stuart Card

Stuart Card was first author of a classic textbook, The Psychology of Human-Computer Interaction, with co-authors Newell and Moran. Card progressed through various research staff grades and in 1986 became a PARC Senior Research Scientist. Two years later, he became Team Leader of PARC’s User Interface Research Group. The breadth of Card’s and PARC’s HCI research from the 1970s to the 1990s is wide in both theory and practice. The work fell into three broad categories: HCI models, information visualization, and information retrieval—and the major contributions in each are breathtaking. One early contribution in HCI models was Card and the team’s analysis of mouse performance using an information-theoretic model of motor movement, Fitts’ Law. With a processing-rate parameter of roughly 10 bits/sec, about the same as the hand itself, the analysis demonstrated that pointing speed was limited not by the device/mouse but by the hand, showing that the mouse was well optimized for human interaction. This impacted the development of the Xerox Star mouse in 1981 and the earliest computer mice developed by Apple Computer. Card’s, and his team’s, work was equally profound in information visualization, in areas such as the Attentive-Reactive Visualizer and visualizer transfer functions. In information retrieval, they advanced Information Foraging Theory.
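For readers unfamiliar with the model, below is a minimal sketch of the Shannon formulation of Fitts’ Law in Python. The constants and the example target are illustrative assumptions rather than the values from the PARC studies; an index of performance of roughly 10 bits/sec corresponds to b = 0.1 seconds per bit.

import math

def index_of_difficulty(distance: float, width: float) -> float:
    """Task difficulty in bits for a target of a given width at a given distance."""
    return math.log2(distance / width + 1)

def movement_time(distance: float, width: float, a: float = 0.1, b: float = 0.1) -> float:
    """Predicted pointing time in seconds: MT = a + b * ID (illustrative constants)."""
    return a + b * index_of_difficulty(distance, width)

# Example: a 16-pixel-wide target 400 pixels away.
ID = index_of_difficulty(400, 16)                      # ~4.7 bits
print(round(ID, 2), round(movement_time(400, 16), 2))  # 4.7  0.57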

While staying at PARC for decades, Card concurrently served as a Stanford University Professor of Computer Science. He became a central contributor to SIGCHI and was tremendously influential to academic, industrial, and government scientists.

In listening to Card’s interview responses (and deeply influenced by my Norman, Dray, and Butler Lampson interviews also, as well as by my past research), I reflected that many AI scientists could learn much from such a mindset of valuing users, all users—knowing users to help augment, for symbiosis, not to control. AI scientists, especially on large scale systems in corporations and government (much ethical AI research is done at universities), could benefit in not merely technical ways, as Steve Jobs and others did from their day in the PARC, but from Card and his team’s ethos and ethics.

Professionalizing HCI: Latent Locomotion to Blissful Brownian Motion

While SIGCHI unintentionally pushed out many non-scientists in the 1980s, it and the HCI field shed a strictly computer science and cognitive science focus to become ever more inclusive of a wide variety of academic scientists, engineers, social scientists, humanities scholars, artists, and others from the 1990s forward. CHI grew from about 1,000 attendees at the first events in Gaithersburg and Boston to more than 3,600 at some recent annual CHI meetings (and SIGCHI now has more than two dozen smaller conferences annually). The SIGCHI/CHI programs and researchers are constantly evolving and exploring varying creative paths that from a 30,000-foot vantage might seem to be many random walks, Brownian motion. The research, designing to better serve users, contributes to many important trajectories. The diversity of disciplines and approaches can make communication more challenging, but also more rewarding, and to a high degree a Galison-like trading zone exists in interdisciplinary SIGCHI and HCI.

One example is the Creativity and Cognition Conference, co-founded by artists/HCI scientists Ernest Edmonds and Linda Candy in 1993, which became a SIGCHI event in 1997. It brings together artists, scientists, engineers, and social scientists to share research on human-computer interaction in art and systems design. As Edmonds related to me, communication and trust between artists and scientists take time, but are immensely valuable. Edmonds is an unparalleled figure in computer generative and interactive art, and a core member of the Systems Group of principally UK computer generative artists. In addition to many prestigious art exhibitions in the 1970s (and beyond), Edmonds published on adaptive software development, with a critique of the waterfall method. His work, in General Systems in 1974, anticipated and helped to define adaptive techniques, later referred to as agile development. Through his artist, logician, and computer science lenses, Edmonds insightfully saw interactive and iterative processes as a new paradigm in programming technique, art, and other design.

HCI research, and its applications, certainly are not always in line with societal good, but the field has an idealistic foundation and values diversity and interdisciplinarity. Historians are still in the early innings of HCI research. Elizabeth Petrick has done particularly insightful scholarship on HCI and disability (2015).

Coding and Codifying, Fast and Slow

Nobel Laureate Daniel Kahneman has published ideas on human cognition that are potentially useful to ponder with regard to AI and HCI. Kahneman studies decision-making and judgment, and how different aspects of these arise from how we think: both fast (emotionally, unconsciously, and instinctively) and slow (more deeply and analytically).

Programming projects for applications and implementation of systems are often behind schedule and over-budget. Code, whether newly developed or recycled, often is applied without an ethical evaluation of its inherent biases.

HCI often involves multiple iterations with users, usability labs, observation in various settings, ethnographic interviewing, and an effective blend of emotional-response fast thinking and, especially, deep reflective slow thinking. Such slow, analytical thinking and iterative programming (especially maintenance and endless debugging) could help begin to uproot underlying algorithmic biases. Slow and careful reflection on how IT laws, practices, policies, culture, and data are codified is likewise instructive. All of this involves ethically interrogating the what, how, why, and by and for whom of innovation, and valuing maintenance labor and processes, not shortchanging maintenance in budget, respect, or compensation.

Beyond “Laws” to Local Knowledge

In 1967 computer scientist Melvin Conway noted what became christened Conway’s Law: computer architecture reflects the communication structure of the organization in which it was developed (an observation made famous by Tracy Kidder’s The Soul of a New Machine). Like Moore’s Law, Conway’s Law is really an observation, and a self-fulfilling prophecy. Better understanding and combatting biases at the macro level is critical. Also essential is evaluation and action at the local and organizational levels. How does organizational culture structure algorithms and code? What organizational policies give rise to what types of code? What do (end) users, including and especially marginalized individuals and groups, have to say about bias? How do decisions at the organizational level reinforce AI/ML algorithmic and data biases, and reinforce and accelerate societal inequality? These are vital questions to consider through many future detailed case studies in settings globally. The goal should not be a new “law,” but rather a journey to gain local knowledge and to learn how historical, anthropological, and sociological cases can inform code and codes, toward policies, designs, maintenance, and structures that are more equitable.

“Why Not Phone Up Robinhood and Ask Him for Some Wealth Distribution”

The lyric above, from the 1978 reggae song “(White Man) In Hammersmith Palais” by The Clash, might be updated to: why not open a Robinhood app… (at least until it suspended trading). How historians will later assess the so-called Robinhood/Reddit “Revolution,” a transfer of roughly $20 billion away from hedge funds, banks, and asset managers over several weeks in early 2021 (punishing bearish GameStop shorting by bidding up shares to force short covering), remains to be seen. Is it a social movement, and of what demographic makeup and type? For many, it likely is, at least in part, a stand against Wall Street, and thus Zuccotti Park comparisons seem apropos. Eighty percent of stock trading volume is automated, algorithmic/programmed (AI/ML), contributing to why a 2021 CNBC poll showed 64 percent of Americans believe Wall Street is rigged. Like capitalism, equities markets and computers combine as a potent wealth-concentrating machine, one turbocharged in pandemic times and fueled by accommodative monetary policy. “Smart” systems and platforms in finance, education, health, and policing have all accelerated longstanding wealth, health, and incarceration gaps to hitherto unseen levels. Not to dismiss volatility or financial risk to the Reddit “revolutionaries,” but the swiftness of regulatory calls by powerful leaders is telling. It raises questions about priorities: regulation for whom, of what, when, and why? U.S. IT giants’ use of AI to surveil, and to dominate through anti-competitive practices, has gone largely unregulated (as has fintech) for years. Given differential surveillance, Black, Indigenous, and People of Color (BIPOC) suffer differentially. The U.S. woefully lags Europe on privacy protections and on taxing corporate use of personal data. U.S. racial violence and murders by police disgracefully dwarf those of other democratic nations, and America stands out for its embrace (by police and courts) of racially biased facial recognition technology (FRT) and recidivism-predicting AI, such as Clearview FRT and Northpointe’s (now Equivant) Correctional Offender Management Profiling for Alternative Sanctions (COMPAS).

Meanwhile, the parallel Chinese IT giants Baidu, Alibaba, and Tencent, dominant in search, e-commerce, and social networking respectively, use intrusive AI. These firms (fostered by the government) are also, ironically, contributing to platforms enabling a “contentious public sphere” (Lei 2017).

At times, users can appropriate digital computing tools against the powerful in unforeseen ways. Such historical agency is critical to document and analyze. History informs us that AI/ML, like many technologies, if left unchecked by laws, regulations, and ethical scrutiny, will continue to be a powerful accelerant of oppression.

Raging Against Machines That Learn

U.S.-headquartered, AI-based IT corporate giants’ record on data and analytics policy and practice has garnered increasing critique from journalists, academics, legislators, activists, and others. The New York Times has reported on clampdowns on employees expressing themselves on social and ethical issues. The co-leader of Google’s Ethical AI group, Timnit Gebru, tweeted in late 2020 that she was fired for sending an email encouraging minority hiring and drawing attention to bias in artificial intelligence. Her email included, “Your life starts getting worse when you start advocating for underrepresented people. You start making the other leaders upset.” (Metz and Wakabayashi 2020).

On June 30, 2020, U.S. Senators Robert Menendez, Mazie Hirono, and Mark Warner wrote Facebook CEO Mark Zuckerberg critiquing his company for failing to “rid itself of white supremacist and other extremist content” (Durkee 2020). A subsequent Facebook internal audit called for better AI, a tech fix. Deep into 2019 Zuckerberg (with a lack of clarity, as at Georgetown in October 2019) sought to defend Facebook’s policies on the basis of free speech. More concerning than his inability to execute free speech arguments is the lack of transparency and the power wielded by a platform with 2.5 billion users; it has immense power to subvert democracy and to harm differentially, and a clear record of profits over principles. In mid-2020 Color of Change, the NAACP, the National Hispanic Media Coalition, and others launched the “Stop Hate for Profit” boycott of Facebook advertising for July 2020; more than 1,200 organizations participated. Pivoting PR amid changing political winds, Zuckerberg is now seeking to shift responsibility to Congress by asking it to regulate (Facebook’s legal team likely will defend the bottom line).

Data for Black Lives, led by Executive Director Yeshimabeit Milner, is an organization and movement of activists and mathematicians. It focuses on fighting for possibilities for data to be used to address societal problems and fighting against injustices, stressing that “discrimination is a high-tech enterprise.” It recently launched Abolish Big Data, “a call to action to reject the concentration of Big Data in the hands of the few, to challenge the structures that allow data to be wielded as a weapon…” (www.d4bl.org). This organization is an exemplar of vital work for change underway, and also of the immense challenge ahead given the power of corporations and government entities (NSA, CIA, FBI, DoD, police, courts).

HCI, never the concentrating force AI has become, continues to steadily grow as a field—intellectually, in diversity, and in importance. It has a record of embracing diversity, helping to augment and advance human and computer symbiosis. More historical work on HCI is needed, but it offers a useful mindset.

Given AI historical scholarship to date, we know its record has been mixed from the start. From its first decades in the 1950s and 1960s to today, the DoD, NSA, CIA, FBI, police, and criminal justice systems have been frequent funders, deployers, and users of AI systems plagued with algorithmic biases that discriminate against BIPOC, women, LGBTQIA people, and the disabled. Some of the most harmful systems have involved facial recognition and predictive policing. Yet, properly designed, monitored, and maintained, AI offers opportunities for science, medicine, and social services (especially at universities and nonprofits).

The social sciences, humanities, and arts can play a fundamental positive role in the design, structuring, and policies of AI/ML. A handful of universities have recently launched interdisciplinary centers focused on AI, history, and society, including the AI Now Institute at NYU (2017) and the Institute for Human-Centered AI at Stanford (2019). The Charles Babbage Institute has made the interdisciplinary social study of AI and HCI a focus (with “Just Code” and beyond): research, archives, events, oral histories, and publications. In computer science, ACM’s Conference on Fairness, Accountability, and Transparency (FAccT), launched in 2018, offers a great forum. Outside academe many are doing crucial research, policy, and activist work; a few examples: Data for Black Lives; Blacks in Technology; NCWIT; AnitaB.org; Algorithmic Justice League; Indigenous AI.Net; and the Algorithmic Bias Initiative (U. of Chicago).

The lack of U.S. regulation to date, discrimination and bias, corporate focus on and faith in tech fixes, inadequate transparency, corporate imperialism, and the overpowering of employees and competitors all have many historical antecedents inside and outside computing. History, the social and policy history of AI and HCI as well as other labor, race, class, gender, and disability history, has much to offer. It can be a critical part of a broad toolkit to understand, contextualize, and combat power imbalances, to better ensure just code and to ethically shape and structure the ghost in the machine that learns.

Acknowledgments: Deep thanks to Bill Aspray, Gerardo Con Diaz, Andy Russell, Loren Terveen, Honghong Tinn, and Amanda Wick for commenting on a prior draft.


Bibliography

Allen, Danielle and Jennifer S. Light. (2015). From Voice to Influence: Understanding Citizenship in a Digital Age. University of Chicago Press.

Alexander, Jennifer. (2008). The Mantra of Efficiency: From Waterwheel to Social Control. Johns Hopkins University Press.

Bardini, Thierry. (2000). Bootstrapping: Coevolution and the Origins of Personal Computing. Stanford University Press.

Benjamin, Ruha. (2019). Race After Technology: Abolitionist Tools for the New Jim Code. Polity.

Card, Stuart K., Thomas Moran, and Allen Newell (1983). The Psychology of Human-Computer Interaction. Lawrence Erlbaum Associates.

Card, Stuart K., Oral History (2020). Conducted by Jeffrey R. Yost, Los Altos Hills, CA, February 17, 2020. CBI, UMN.

Dick, Stephanie. (2020). “NYSIIS, and the Introduction of Modern Digital Computing to American Policing.” Just Code: Power, Inequality, and the Global Political Economy of IT (Symposium presentation: Oct. 23). [Hereafter “Just Code” Symposium]

D’Ignazio, Catherine and Lauren Klein. (2020). Data Feminism. MIT Press.

Dray, Susan, Oral History (2020). Conducted by Jeffrey R. Yost, CBI, Minneapolis, Minnesota, January 28, 2020. CBI, UMN.

Durkee, Alison. (2020). “Democratic Senators Demand Facebook Answer For Its White Supremacist Problem.” Forbes. June 30. (Accessed online at Forbes.com).

Dryer, Theodora. (2020). “Streams of Data, Streams of Water: Encoding Water Policy and Environmental Racism.” “Just Code” Symposium.

Edmonds, Ernest. (1974). “A Process for the Development of Software for Non-Technical Users as an Adaptive System.” General Systems 19, 215-218.

Eubanks, Virginia. (2019). Automating Inequality: How High-Tech Tools Profile, Police, and Punish the Poor. Picador.

Galison, Peter. (1999) “Trading Zone: Coordinating Action and Belief.” In The Science Studies Reader, ed. by Mario Biagioli. Routledge. 137-160.

Grudin, Jonathan. (2017). From Tool to Partner: The Evolution in Human-Computer Interaction. Morgan and Claypool.

Hiltzik, Michael. (2009). Dealers of Lightning: Xerox PARC and the Dawn of the Computer Age. HarperCollins.

Kahneman, Daniel. (2011). Thinking, Fast and Slow. Farrar, Straus and Giroux.

Kidder, Tracy. (1981). The Soul of a New Machine. Little, Brown and Company.

Lampson, Butler, Oral History (2014). Conducted by Jeffrey R Yost, Cambridge, Massachusetts, December 11, 2014. Charles Babbage Institute, UMN.

Lauer, Josh and Kenneth Lipartito. (2020). “Infrastructures of Extraction: Surveillance Technologies in the Modern Economy.” “Just Code” Symposium.

Light, Jennifer S. (2005). From Warfare to Welfare: Defense Intellectuals and Urban Problems in Cold War America. University of Chicago Press.

McIlwain, Charlton. (2020). Black Software: The Internet and Racial Justice, from the AfroNet to Black Lives Matter. Oxford University Press.

Mendon-Plasek, Aaron. (2021). “Mechanized Significance and Machine Learning: Why It Became Thinkable and Preferable to Teach Machines to Judge the World.” In J. Roberge and M. Castelle, eds. The Cultural Life of Machine Learning. Palgrave Macmillan, 31-78.

Menking, Amanda and Jon Rosenberg. (2020). “WP:NOT, WP:NPOV, and Other Stories Wikipedia Tells Us: A Feminist Critique of Wikipedia’s Epistemology.” Science, Technology, & Human Values, May, 1-25.

Metz, Cade and Daisuke Wakabayashi. (2020). “Google Researcher Says She was Fired Over Paper Highlighting Bias in AI.” New York Times, Dec. 2, 2020.

Norman, Don, Oral History. (2020). Conducted by Jeffrey R. Yost, La Jolla, California, February 12, 2020. CBI, UMN.

Noble, Safiya Umoja. (2018). Algorithms of Oppression: How Search Engines Reinforce Racism. New York University Press.

Petrick, Elizabeth. (2015). Making Computers Accessible: Disability Rights and Digital Technology. Johns Hopkins University Press.

Turner, Fred. (2010). From Counterculture to Cyberculture: Stewart Brand, the Whole Earth Network, and the Rise of Digital Utopianism. University of Chicago Press.

Vinsel, Lee and Andrew L. Russell. (2020). The Innovation Delusion: How Our Obsession with the New Has Disrupted the Work That Matters Most. Currency.

Yost, Jeffrey R. (2016). “The March of IDES: Early History of Intrusion Detection Expert Systems.” IEEE Annals of the History of Computing 38:4, 42-54.

Yost, Jeffrey R. (2017). Making IT Work: A History of the Computer Services Industry. MIT Press.

 

Yost, Jeffrey R. (March 2021). “Of Mice and Mentalité: PARC Ways to Exploring HCI, AI, Augmentation and Symbiosis, and Categorization and Control.” Interfaces: Essays and Reviews in Computing and Culture Vol. 2, Charles Babbage Institute, University of Minnesota, 12-26.


About the author:  Jeffrey R. Yost is CBI Director and HSTM Research Professor at the University of Minnesota. He has published six books (and dozens of articles), most recently Making IT Work: A History of the Computer Services Industry (MIT Press, 2017) and FastLane: Managing Science in the Internet World (Johns Hopkins U. Press, 2016) [co-authored with Thomas J. Misa]. He is a past EIC of IEEE Annals of the History of Computing, and current Series Co-Editor [with Gerard Alberts] of Springer’s History of Computing Book Series.  He has been a principal investigator of a half dozen federally sponsored projects (NSF and DOE) on computing/software history totaling more than $2 million. He is Co-Editor [with Amanda Wick] of Interfaces: Essays and Reviews in Computing & Culture.


 

The Cloud, the Civil War, and the “War on Coal”

Paul E. Ceruzzi, National Air and Space Museum, Smithsonian Institution

Abstract: The term “The Cloud” has entered the lexicon of computer-speak along with “cyberspace,” the “Matrix,” the “ether,” and other terms suggesting the immateriality of networked computing. Cloud servers, which store vast amounts of data and software accessible via the Internet, are located around the globe. This essay argues that this “matrix” has an epicenter, namely the former rural village of Ashburn, Virginia. Ashburn’s significance is the result of several factors, including northern Virginia’s historic role in the creation of the Internet and its predecessor, the ARPANET. The Cloud servers located there also exist because of the availability of sources of electric power, including a grid of power lines connected to wind turbines and to gas- and coal-fired plants located to its west: a “networking” of a different type but just as important.


 

civil war map
A map of northeastern Virginia and the vicinity of Washington prepared by the Union Army in 1862. Labels inserted by the author indicate 'Cloud' sites.

In his recent book, Making IT Work, Jeffrey Yost quotes a line from the famous Joni Mitchell song “Clouds”: “I really don’t know clouds at all.” He also quotes the Rolling Stones’ hit, “[Hey, you,] Get off my Cloud.” Why should a business or government agency trust its valuable data to a third party whose cloud servers are little understood? No thank you, said the Rolling Stones; not until you can explain to me just what the Cloud is and where it is. Yost gives an excellent account of how cloud servers have come to the fore in current computing. Yet Joni Mitchell’s words still ring true. Do we really know what constitutes the “Cloud”?

A common definition of the Cloud is that of sets of high-capacity servers, scattered across the globe, using high-speed fiber to connect the data stored therein to computing installations. These servers supply data and programs to a range of users, from mission-critical business customers to teenagers sharing photos on their smartphones. What about that definition is cloud-like? Our imperfect understanding of the term is related to the misunderstanding of similar terms also in common use. One is “cyberspace,” the popularity of which is attributed to the science fiction author William Gibson and his novel Neuromancer, published in 1984. Another is “The Matrix”: the title of a path-breaking book on networking by John Quarterman, published in 1990 at the dawn of the networked age. The term came into common use after the award-winning 1999 Warner Brothers film starring Keanu Reeves. (Quarterman was flattered that Hollywood used the term, but he is not sure whether the producers of the film knew of his book.) In the early 1970s, Robert Metcalfe, David Boggs, and colleagues at the Xerox Palo Alto Research Center developed a local area networking system they called “Ethernet,” suggesting the “luminiferous aether” that was once believed to carry light through the cosmos.

These terms suggest an entity divorced from physical objects: pure software independent of underlying hardware. They imply that one may dismiss the hardware component as a given, just as we assume that fresh, drinkable water comes out of the tap when we are thirsty. The residents of Flint, Michigan know that a robust water and sewerage infrastructure is hardly a given, and Nathan Ensmenger has reminded us that the “Cloud” requires a large investment in hardware, including banks of disk drives, air conditioning, fiber connections to the Internet, and above all, a supply of electricity. Yet the perception persists that the Cloud, like cyberspace, is out there in the “ether.”

Cloud server, Ashburn.
Cloud server, Ashburn. Note air conditioning units on the roof. Photo by the author.

Most readers of this journal know of the physical infrastructure that sustains Ethernet, cyberspace, and the Cloud. I will go a step further: not only does the Cloud have a physical presence, but it also has a specific location on the globe: Ashburn, Virginia.

A map prepared by the Union Army in 1862 of northern Virginia shows the village of Farmwell, and nearby Farmwell Station on the Alexandria, Loudoun, and Hampshire railroad. The town later changed its name to Ashburn, and it lies just to the north of Washington Dulles International Airport. In the early 2000s, as I was preparing my study of high technology in northern Virginia, Ashburn was still a farming community. By the year 2000 the old Farmwell Station area was a modest center of Ashburn: a collection of buildings centered on a general store. The railroad had been abandoned in 1968 and was now the Washington and Old Dominion rail-trail, one of the most popular and heavily traveled rails-to-trails conversions in the country. Thirsty hikers and cyclists could get refreshment at the general store, which had also served neighboring farmers with equipment and supplies.

“a:” root server bike trail
Former location of the “a:” root server of the dot.com and dot.org registries (far left), Herndon. Photo taken by the author from the W&OD rail trail.

Cycling along the trail west of Route 28 in 2020, one saw a series of enormous low buildings, each larger than a football field, surrounded by a mad frenzy of construction, with heavy equipment trucks chewing up the local roads. Overhead was a tangle of high-tension electrical transmission towers, with large substations along the way distributing the power. The frenzy of construction suggested what it was like to have been in Virginia City, Nevada, after the discovery and extraction of the Comstock Lode silver. The buildings themselves had few or no markings on them, but a Google search revealed that one of the main tenants was Equinix, a company that specializes in networking. The tenants of the servers try to avoid publicity, but the local chambers of commerce, politicians, and real estate developers are proud to showcase the economic dynamo of the region. A piece on the local radio station WTOP on November 17, 2020, announced that “Equinix further expands its big Ashburn data center campus,” quoting a company spokesperson saying that “…its Ashburn campus is the densest interconnection hub in the United States.” An earlier WTOP broadcast, reporting on the activities of a local real estate developer, noted that “Northern Virginia remains the ‘King of the Cloud.’” In addition to Equinix, the report mentioned several other tenants, including Verizon and Amazon Web Services.

These news accounts are more than just hyperbole from local boosters. Other evidence indicates that, although cloud servers are scattered across the globe, Ashburn is indeed the navel of the Internet.

In my 2008 study of Tysons Corner, Virginia, I mentioned several factors that led to the rise of what I then called “Internet Alley.” One was the development of ARPANET at the Pentagon, and later at a DARPA office on Wilson Blvd. in Rosslyn. Another was the rise of the proto-Internet company AOL, headquartered in Tysons Corner. Tysons Corner was also the location of “MAE-East,” a network hub that carried a majority of Internet traffic in the network’s early days. The root servers of the dot.com and dot.org registry were once located in the region as well, with the a: root server in Herndon, later moved to Loudoun County. The region thus had a skilled workforce of network-savvy electrical and computer engineers, plus local firms such as SAIC and Booz-Allen that supported networking as it evolved from its early incarnations.

ARPA signs
Plaques at 1401 Wilson Blvd., Rosslyn, commemorating the work of ARPA’s Information Processing Techniques Office, whose offices were in this building. The binary code spells “ARPANET” in 8-bit ASCII. Photos by the author.
 

Around the year 2000, while many were relieved that the “Y2K” bug had little effect on mainframe computers, the dot.com frenzy collapsed. The AOL-Time Warner merger was a mistake. But there was an upside to the boom and bust. In the late 19th and early 20th centuries the nation experienced a similar boom and bust of railroad construction. Railroads went bankrupt and people lost fortunes, but the activity left behind a robust, if overbuilt, network of railroads that served the nation well during the mid and late 20th century. During the dot.com frenzy, small firms like Metropolitan Fiber dug up many of the roads and streets of Fairfax and Loudoun Counties and laid fiber optic cables, which offered high-speed Internet connections. After the bust these became unused, “dark fiber” as it was called. Here was the basis for establishing Cloud servers in Ashburn. By 2010 little land was available in Tysons Corner, Herndon, or Reston, but a little further out along the W&OD rail-trail was plenty of available land.

That leaves the other critical factor in establishing Cloud servers—the availability of electric power. While some Cloud servers are located near sources of wind, solar, or hydroelectric power, such as in the Pacific Northwest, Northern Virginia has few of those resources. The nearest large-scale hydroelectric plant, at the Conowingo Dam, lies about 70 miles to the north, but its power primarily serves the Philadelphia region. (That plant was the focus of the classic work on electric power grids, Networks of Power, by Thomas Parke Hughes.) To answer the question of the sources of power for Ashburn, we return to the Civil War map and its depiction of the Alexandria, Loudoun, and Hampshire, later known as the Washington and Old Dominion Railroad.

The origins of that line go back to the 1840s, when freight, especially coal, from the western counties of Virginia was being diverted to Baltimore, Maryland over the Baltimore and Ohio Railroad. In response, Virginians chartered a route west over the Blue Ridge to the mineral- and timber-rich areas of Hampshire County. (In 1866, Mineral County was carved out of Hampshire County in the new state of West Virginia.) The Civil War interrupted progress in construction, and after several challenges to its financial structure, the line was incorporated as the Washington and Old Dominion Railway Company in 1911. It never reached farther than the summit of the Blue Ridge, and the proposed route to the west would have had to cross rugged topography. The line could never have competed with the B&O’s water-level route. The shortened line soldiered on until finally being abandoned in 1968, making way for the rail-trail conversion. One interesting exception was a short spur in Alexandria, which carried coal to a power plant on the shore of the Potomac. That plant was decommissioned in 2014, thus ending the rail era of the Alexandria, Loudoun, and Hampshire.

W&OD freight train
Washington & Old Dominion freight train, shortly before the line’s abandonment in 1968.

 

In 1968, the rails-to-trails movement was in its infancy. Most of the freight once carried by rail was now being carried by trucks, and there was little room for rail-dependent industries to survive in a region of fast-growing residential suburban towns. There was every reason to assume that the right of way would revert to local landowners and be developed for commercial and residential use; that was the fate of the line west from Purcellville to the summit of the Blue Ridge at Snickers Gap. But the rest of the right of way was preserved. Shortly before abandonment, the Virginia Electric and Power Company entered into an agreement with the Virginia Highway Department to purchase most of the remaining right of way as a conduit for high-voltage power lines, which would supply electric power to northern Virginia. The agreement was criticized at the time, but among its results was the preservation of the right of way, making way for the establishment of the W&OD rail-trail by the Northern Virginia Regional Park Authority. As mentioned above, the trail is very popular for hiking, cycling, and horseback riding. Most of its users do not mind the power lines overhead. Given the rapid growth of suburbia in Fairfax and Loudoun counties, the trail could not have had the rural character common to many rail-trails in the country.

The power lines tell us how electric power gets to the Cloud servers. Where the power comes from is more complex. At the time Virginia Electric was negotiating for the right of way, the engineering firm Stone and Webster was building a power plant at Mount Storm, in Grant County, West Virginia, in the heart of rich coal deposits. Upon its completion, the plant had a capacity of 1,600 megawatts. Beginning in the early 2000s, the plant’s output was supplemented by a set of wind turbines located along the Allegheny Front, the divide between waters that flow directly to the Atlantic and those that flow to the Ohio and Mississippi Rivers. These turbines supply an additional 264 megawatts of power.

The Alexandria, Loudoun, and Hampshire Railroad was never completed far enough west to carry coal from the western mountains. Its charter, however, has been fulfilled, as the right of way now carries energy in the form of electricity generated by coal and wind from those mountains. The railroad’s founders were not thinking of Cloud servers, but today’s Cloud is powered, at least in part, by coal.

End of the line
“End of the Line,” 2014—last remnant of W&OD railroad, used to deliver coal to a power plant in Alexandria, before its decommissioning. Photo by the author.
 

Coal mining in West Virginia and western Maryland is in a precipitous decline. Within a few years it may vanish altogether. Those involved with the construction and management of Data Centers in Loudoun County have stated that those centers will reduce their dependency on coal to zero by the next decade. In addition to converting to natural gas, described below, Virginia is supporting the further development of wind turbines, increasingly located offshore as well as in the mountains. Data centers are also exploring the use of geothermal resources.

These efforts will help reverse disturbing trends of global climate change, but the decline of the coal industry has been devastating to the economies of western Virginia, western Maryland, and West Virginia, which are experiencing lay-offs and unemployment among miners and railroad workers. The cause is not the so-called “war on coal,” allegedly waged by Washington politicians. The primary cause is the development of hydraulic fracturing, or “fracking,” of rock, which allows rapid unlocking of natural gas deposits in Appalachia. The technique does require labor, but not on the scale of traditional coal mining. And the gas is transported not by rail but by pipelines, buried under the ground and largely invisible. Natural gas burns much cleaner than coal. Fracking has allowed natural gas to supplant coal for most new power plants. It has also hastened the conversion of older, coal-fired plants to gas. An 800-megawatt plant in Dickerson, Maryland, across the Potomac from Loudoun County and another major supplier of energy to the region, converted from coal to gas at the end of 2020. A similar conversion has taken place at the Chalk Point, Maryland plant. As of this writing, the Mount Storm plant remains coal-fired.

Power lines along the bike trail
Power lines, Ashburn. A second set of lines was added after the Panda Stonewall power plant came on line. Photo by the author.

In 2017, the “Panda Stonewall” power plant came on-line. It is located south of Leesburg, a few miles west of Ashburn. The primary market for its 778-Megawatt output is the cloud complex at Ashburn. In promotional literature, the plant’s owner, Panda Power Funds of Dallas, Texas, touts its clean-burning natural gas fuel. The gas is transmitted by pipeline from the Marcellus Shale deposits centered in western Pennsylvania. To handle this new source of power, new substations and overhead lines were built over and beside the W&OD trail from Leesburg to Ashburn.

Conclusion

The center of the Cloud is in Ashburn, Virginia. It runs on a variety of energy sources, including coal, wind, and natural gas from the Marcellus Shale deposits. Cloud servers are indeed scattered across the globe, but in Ashburn one can observe first-hand the dramatic transformation of computing. The servers require electric power, and its sources, wind, solar, hydro, coal, and gas, all have environmental impacts. In his study of the Cloud, Jeffrey Yost mentioned the two songs by Joni Mitchell and the Rolling Stones. To those I would add a third: the jazz album by the Czech bassist Miroslav Vitous, “Mountain in the Clouds.” The title suggests the serenity and ethereal nature of the cloud, but the music is quite different: a cacophony of clashing instruments, driven by a frenzied drummer and bass line, echoing the frenzy of cloud construction in Northern Virginia.


Bibliography 

Bechtel Corporation, “Virginia Power Plant is one of the nation’s cleanest.” https://www.bechtel.com/projects/stonewall-energy-center/ accessed 12/10/2020.

Ceruzzi, Paul E. (2008). Internet Alley: High Technology in Tysons Corner, 1945-2005. Cambridge, MA: MIT Press. 

Ensmenger, Nathan. (October 2018). “The Environmental History of Computing,” Technology & Culture, 59/4 Supplement, pp. S7-S33.

Equinix Corporation, “Equinix further expands its big Ashburn data center campus,” also https://www.equinix.com/data-centers/americas-colocation/. Accessed 12/10/2020.

“Crushing it: The world is finally burning less coal. It now faces the challenge of using almost none at all,” The Economist, December 5, 2020, pp. 25-28.

Hughes, Thomas Parke. (1983). Networks of Power: Electrification in Western Society, 1880-1930, Baltimore, Johns Hopkins University Press.

National Public Radio, “Supreme Court Says Pipeline May Cross Underneath Appalachian Trail,” Broadcast June 15, 2020, 6:09 PM ET. https://www.npr.org/2020/06/15/877643195/supreme-court-says-pipeline-may-cross-underneath-appalachian-trail. Accessed 12/10/2020.

Vitous, Miroslav, “Mountain in the Clouds,” Atlantic Records, SD 1622, 1975. Hear a YouTube recording of the initial track: https://www.youtube.com/watch?v=zafIe4Aduus. Accessed 12/21/2020.

WTOP Radio, “Northern Virginia retains the ‘king of the cloud’,” https://wtop.com/business-finance/2020/09/northern-virginia-remains-the-king-of-the-cloud/ accessed 12/10/2020.

Williams, Ames W. (1984). Washington & Old Dominion Railroad, 1847-1968. Meridian Sun Press, p. 109.

Yost, Jeffrey R. (2017). Making IT Work: A History of the Computer Services Industry. Cambridge, MA: MIT Press.

 

Paul E. Ceruzzi (January 2021). “The Cloud, the Civil War, and the ‘War on Coal’.” Interfaces: Essays and Reviews in Computing and Culture Vol. 2, Charles Babbage Institute, University of Minnesota, 1-11.


About the Author:

Paul Ceruzzi is Emeritus Curator of Aerospace Electronics at the Smithsonian Institution's National Air and Space Museum. He is the author of several books on the history of computing and aerospace, including his most recent GPS​ (MIT Press 2018). His book on high technology in Northern Virginia, Internet Alley​, was published in 2008. He lives with his family in the Maryland suburbs of Washington, DC.