Project Proposal for Mainframes: Present-Past


Mainframes: Present-Past is a web journal collecting scholarship for the humanistic study of mainframe computing. The project hopes to attract a multidisciplinary effort to recover the experience and heritage of computing on mainframe systems in the era before the personal computer (1950s–1970s). Contributions can vary in focus: archived visual material, economic and social history, general education, business and critical infrastructure studies, interactive emulation of mainframe software, and the cultural poetics of this brand of computing are all welcome.

Problem addressed

There is a dearth of (digital) humanities scholarship about mainframe computers. Antecedents can be traced in works from various “media archaeologists”: Tung-Hui Hu’s A Prehistory of the Cloud (cultural poetics of cloud computing), Kirschenbaum’s Track Changes: A Literary History of Word Processing (word processing as software), Liu’s The Laws of Cool (knowledge work), Lisa Gitelman’s “Raw Data” is an Oxymoron (various histories and pre-histories of data), and Friedrich Kittler (general theory), among a few others. But in these works mainframe computing is an implementation detail in larger arguments, often glossed over. So much of our computing inherits key concepts from the features and limitations of the age of Big Iron. Surprisingly, critical infrastructure studies practically skips mainframes as a topic, despite continued reliance on mainframes for batch transaction processing (e.g., financial transactions). This reliance was recently highlighted as a problem area of critical infrastructure when states struggled to issue checks during the pandemic as a result of the COBOL programmer shortage.

Project audience

Scholars and students interested in imaginative explorations of what it was like to interact with and encounter mainframe interfaces in person, by proxy (timesharing), and as a cultural phenomenon in popular media and the workplace. I believe there is also space for educational material aimed at non-technical audiences in the broader public, depending on contributor interest.

Contribution and impact

The study of mainframe computing is not only a historical exercise in preservation. Because of the foreignness of mainframe computing from contemporary experience, it becomes a prism to explore larger topics related to information technology in societies. One could study gender in computing: American women in programming during this era outnumbered their present-day counterparts, and were seen as “more meticulous” than their male colleagues until the formalization of computer science within university engineering departments. Mainframe computers still play an integral role in critical infrastructure: there is a reason for the afternoon cut-off after which check deposits to checking accounts are no longer processed the same day, and state payment processing still flows through these systems. Issues of the technical knowledge and complexity required to operate mainframes relate to the abstraction of these technical details, touching on workplace casualization. I could imagine an analysis of the effect of line editors on word processing and on the style of business writing exemplified in office memos. The subject of mainframe computing lends itself to different flavors of humanist study too numerous to mention here.

Final product

A web archive/journal about mainframe computing. The site could include writing, collections of visual material and ephemera (manuals, advertising, etc.), and (for the adventurous contributor) interactive emulations of mainframe computing in the browser.

Feasibility assessment

Tool selection

Depending on the contributions and research done, we’ll use either Ed or Wax (thanks, Minimal Computing Working Group), both of which are based on Jekyll, the static site generator. This choice has a number of benefits: it reduces the cost of hosting and deployment (no financial cost if hosted on GitHub or GitLab) and of maintenance (no backend), and it is eminently secure (again, no backend systems or database). Jekyll is also relatively simple in design and easy to reason about without much technical overhead. Use of these tools also allows for preservation in the face of changing web technology, as text is rendered from Markdown. The design of Ed and Wax also accounts for low-bandwidth scenarios, a boon for accessibility.
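For contributors unfamiliar with Jekyll, each entry is just a plain-text Markdown file with a YAML front-matter header; Jekyll renders it to static HTML. The sketch below is illustrative only — the title and field values are hypothetical, and the exact layout names depend on whether we adopt Ed or Wax:

```markdown
---
layout: post
title: "A Hypothetical Entry on Timesharing"  # placeholder title
author: "Contributor Name"                    # placeholder author
---

Body text written in ordinary Markdown, which Jekyll renders
to static HTML at build time — no database or backend required.
```

Because the source of record is plain text under version control, entries remain readable and portable even if the site generator itself is eventually replaced.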

Team composition

In my original proposal, I listed the following roles:

  • One to two researchers
  • One information architect
  • One front-end developer (HTML, CSS, JavaScript)

The combination of the platform on which this project will be built and the subject matter makes team composition flexible, though, changing shape based on contributor interest and skill sets. I’m confident we’d produce interesting material regardless of team composition. If people are interested in gaining experience with some foundational developer skills like Git/GitHub, terminal usage, or HTML/CSS/JavaScript, this is a project where you could likely learn them. But they aren’t required.

Barriers and challenges

I believe the biggest challenges are related to original research and to tracking down archival material that matches specific subject matter, though I’d imagine university archives and websites, along with institutions like the Computer History Museum, can mitigate the risk of coming up empty-handed.