How computers began to slowly replace humans | David Alan Grier: Full Interview

Historian David Alan Grier argues that computing history is really a history of labor and standardization. From 18th-century star charts to modern AI, discover how the systematization of production during the Industrial Revolution paved the way for computers to replace human effort.

When we think of the Industrial Revolution, we typically picture steam engines, textile mills, and the physical transformation of raw materials. However, historian and author David Alan Grier argues that the true essence of this era was the systematization of production—a concept that applies as much to information as it does to cotton or steel. The history of the computer is not merely a history of electronics; it is a history of labor, standardization, and the relentless human drive to organize the world into predictable, error-free processes.

From the star charts of the 18th century to the artificial intelligence algorithms of today, the evolution of computing has been defined by the need to handle scale. Whether tracking Halley’s Comet or mapping the human genome, the fundamental challenge remains the same: how to reduce vast amounts of complex data into actionable insights at the lowest possible cost.

Key Takeaways

  • Computing is an industrial process: The field originated not from abstract mathematics, but from the need to systematize labor and produce uniform results, akin to factory production.
  • Division of labor drove innovation: Early "computers" were humans. To manage complex calculations, tasks were broken down into simple steps that required less skilled labor, paving the way for mechanization.
  • Standardization was a prerequisite: Before computers could communicate, the world needed standard time zones, educational metrics, and manufacturing specifications.
  • Data changes self-perception: Every major leap in data processing—from the 1890 Census to modern social media—has fundamentally altered how humans view themselves and their place in society.
  • The conflict over skill ownership persists: The tension between skilled workers and automation, first seen with 1950s machinists, mirrors today’s debates over AI and data ownership.

The Industrial Roots of Calculation

The year 1776 is famous for the American Declaration of Independence, but it also marked the publication of Adam Smith’s The Wealth of Nations. Smith’s opening chapters detailed the division of labor and the systematization of work—principles that became the bedrock of the Industrial Revolution. Interestingly, these industrial concepts were immediately embraced by the scientific community, particularly astronomers, who faced a massive data problem.

Mapping the Heavens

In the 18th century, astronomy was the primary tool for navigation. Observatories were staffed to map the stars night after night, generating mountains of raw data that needed to be reduced to absolute coordinates. This was a repetitive, high-volume task that required a factory-like approach to arithmetic. To handle the return of Halley’s Comet, French astronomers divided the labor: some teams calculated the gravitational influence of Jupiter and Saturn on the comet’s orbit, while others tracked the comet itself. This division of labor proved that complex problems could be solved by breaking them into smaller, parallel tasks.
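
The pattern is easy to state in modern terms. Below is a minimal sketch in Python (the workload is an invented stand-in; the real comet calculation involved far more elaborate perturbation mathematics): one large computation is split into independent slices, each slice is handed to a separate worker, and the partial results are combined.

```python
# A toy illustration of dividing one large calculation among several
# "computers": each worker handles an independent slice, and the
# results are combined at the end. The summation is a stand-in task.
from concurrent.futures import ProcessPoolExecutor

def partial_sum(bounds):
    # One worker's share of the overall calculation.
    lo, hi = bounds
    return sum(x * x for x in range(lo, hi))

if __name__ == "__main__":
    # Split one million terms into four independent slices.
    chunks = [(i, i + 250_000) for i in range(0, 1_000_000, 250_000)]
    with ProcessPoolExecutor(max_workers=4) as pool:
        total = sum(pool.map(partial_sum, chunks))
    print(total)  # matches sum(x * x for x in range(1_000_000))
```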

The Problem of Error

As the demand for data grew—driven by nautical almanacs for global trade and land surveys for expanding nations—so did the potential for error. Charles Babbage, a pioneer of mechanical calculation, identified a phenomenon that plagued human computers.

As Grier explains: "Charles Babbage discovered what he called Babbage's Rule, which is that two calculations done the same way by different people will tend to make the same errors. There seems to be, in the process of hand calculation, mistakes that trip up everybody."

To combat this, the early industrialists of computing developed systems where calculations were done in different ways to expose errors. This focus on process and verification laid the intellectual groundwork for the mechanical devices that would follow.
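
The remedy is easy to sketch. The short Python example below (the polynomial is illustrative, not from the interview) builds the same table two independent ways, by direct evaluation and by the finite-difference method Babbage later mechanized, and flags any row where the two disagree.

```python
# Cross-checking one table computed two independent ways: if an error
# creeps into either method, the row-by-row comparison exposes it.

def table_direct(n):
    """Evaluate f(x) = x**2 + x + 41 directly for x = 0..n-1."""
    return [x * x + x + 41 for x in range(n)]

def table_by_differences(n):
    """Build the same table by finite differences, the method Babbage
    mechanized: each entry is derived from the previous one."""
    values = [41]        # f(0)
    first_diff = 2       # f(1) - f(0)
    for _ in range(n - 1):
        values.append(values[-1] + first_diff)
        first_diff += 2  # the second difference of f is a constant 2
    return values

direct = table_direct(10)
differenced = table_by_differences(10)
for x, (a, b) in enumerate(zip(direct, differenced)):
    if a != b:
        print(f"disagreement at x={x}: {a} vs {b}")
print("tables agree" if direct == differenced else "errors found")
```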

Standardization: The Invisible Infrastructure

For computing to scale, the world needed to be standardized. This went beyond nuts and bolts; it required the standardization of time, education, and information itself. The telegraph, which connected cities like Washington, D.C. and Baltimore, immediately highlighted the problem of local time differences. To coordinate railroad schedules safely and prevent collisions, a unified system of time was essential. This seemingly administrative shift was, in reality, a massive data synchronization project.
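
A toy example makes the synchronization problem concrete. Before standard time, each city kept its own solar time, so comparing two schedule entries first required converting them to a shared reference. The offsets below approximate the two cities' solar times and are purely illustrative.

```python
# Converting local clock readings to a shared reference time, the step
# that standard time zones made unnecessary. Offsets are illustrative.
from datetime import datetime, timedelta, timezone

local_offsets = {
    "Washington": timedelta(hours=-5, minutes=-8),  # local solar time
    "Baltimore": timedelta(hours=-5, minutes=-6),
}

def to_reference(city, local_reading):
    """Convert a city's local clock reading to a shared reference."""
    return (local_reading - local_offsets[city]).replace(tzinfo=timezone.utc)

noon_dc = to_reference("Washington", datetime(1883, 11, 18, 12, 0))
noon_balt = to_reference("Baltimore", datetime(1883, 11, 18, 12, 0))
print(noon_dc - noon_balt)  # 0:02:00 -- the "same" noon, two minutes apart
```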

Standardizing Education and the Workforce

By the early 20th century, standardization reached the educational system. The Carnegie Foundation and the Flexner Report revolutionized how professionals were trained, establishing credit hours and standard curricula. This had unintended but profound consequences for the workforce. As the medical profession was standardized, women, who had previously played significant roles in medicine, were largely pushed out.

Mathematics, desperate for enrollment, welcomed these women. Consequently, high school mathematics and human computation became "feminized" fields. When the demand for calculation exploded during the World Wars, there was a ready workforce of educated women prepared to step into roles as human computers.

The Census and the Psychology of Data

As the United States grew, the constitutional mandate to count the population every ten years became an impossible logistical challenge. The 1880 census was still being tabulated when the 1890 count was due to begin. The solution was Herman Hollerith's tabulating machine, a device that used punch cards to tally data electrically. Hollerith's Tabulating Machine Company was a direct precursor of IBM.

The impact of the 1890 census was not just administrative; it was psychological. The speed of the tabulation allowed historians to declare the American frontier "closed" just three years after the count began. Furthermore, the punch card created a new abstraction of the human being.

Grier describes the reaction of one census worker: "He would hold it up and he felt that when he was putting it through the tabulator, he was working out the future of that person. And the bell which indicated that the card had been processed was the bell calling that soul to judgment of heaven or hell."

This marked the beginning of a modern phenomenon: falling in love with the data representation of ourselves. Just as census workers saw "souls" in punch cards, modern users see their identity reflected in their personal computers and social media profiles.

War, Architecture, and the Displacement of Humans

World War II accelerated the transition from human calculation to electronic computing. The problem of ballistics, specifically shooting down moving aircraft, required solving differential equations at speeds human beings could not match. The ENIAC (Electronic Numerical Integrator and Computer) was developed to compute these firing tables.

The Von Neumann Architecture

While the ENIAC was a marvel of engineering, the conceptual breakthrough came from John von Neumann. Improving upon the work of the ENIAC team, von Neumann abstracted the machine into three distinct components that define computers to this day:

  1. Memory: A place to store numbers and data.
  2. Processing Unit: A component to perform arithmetic and logic operations.
  3. Instruction Decoder: A system to read a program and direct the traffic of information.
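
A minimal simulator makes the abstraction concrete. The sketch below is an invented illustration, not any historical machine: a single memory holds both the program and its data, an accumulator plays the role of the processing unit, and the loop acts as the instruction decoder, fetching each instruction and directing the traffic.

```python
# A toy stored-program machine: memory holds instructions and data,
# the accumulator does the arithmetic, the loop decodes and dispatches.
memory = {
    0: ("LOAD", 100),   # program: copy memory[100] into the accumulator
    1: ("ADD", 101),    #          add memory[101] to the accumulator
    2: ("STORE", 102),  #          write the accumulator to memory[102]
    3: ("HALT", None),
    100: 2,             # data
    101: 3,
    102: 0,
}

accumulator = 0
program_counter = 0
while True:
    opcode, addr = memory[program_counter]   # fetch
    program_counter += 1
    if opcode == "LOAD":                     # decode and execute
        accumulator = memory[addr]
    elif opcode == "ADD":
        accumulator += memory[addr]
    elif opcode == "STORE":
        memory[addr] = accumulator
    elif opcode == "HALT":
        break

print(memory[102])  # 2 + 3 -> 5
```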

The Human Cost

This era also highlighted the complicated relationship between human labor and machines. The Mathematical Tables Project in New York employed hundreds of human computers (largely poor, unemployed workers hired during the Depression) to perform high-level calculations for the war effort, including the Manhattan Project. These workers took immense pride in their connection to these grand scientific endeavors. However, as machines became more capable, this class of labor was displaced, a cycle that continues to repeat as technology advances.

The Modern Era: From ARPANET to AI

The post-war era saw the development of ARPANET, the precursor to the Internet, which solved the problem of sharing computing resources across distances. Crucially, it introduced the concept of "search" as a fundamental computer science problem. It wasn't enough to store data; one had to find it.
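
A small sketch shows why finding data is a distinct problem from storing it. The toy inverted index below (documents and query are invented for illustration) maps each word to the set of documents containing it, so answering a query means consulting the index rather than rereading every document.

```python
# A toy inverted index: storage is the `documents` dict, search is the
# extra structure built on top of it.
from collections import defaultdict

documents = {
    1: "star charts reduced by human computers",
    2: "punch cards tabulate the census",
    3: "human computers calculate firing tables",
}

index = defaultdict(set)
for doc_id, text in documents.items():
    for word in text.split():
        index[word].add(doc_id)

# Which documents mention both words?
hits = index["human"] & index["computers"]
print(sorted(hits))  # -> [1, 3]
```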

The Struggle for Ownership

As we moved into the age of automation and now Artificial Intelligence, the conflict has shifted from physical labor to the ownership of skill and data. In the 1950s, machinists protested against automated tools that "recorded" their skilled movements to replay them later. They argued that their skill was their property. Today, this battle is fought over the data used to train AI models.

As Grier puts it: "The question becomes, who owns the activities that are being captured by data? ... It's not clear that this fight has been resolved yet. What do we have that we own of ourselves, own of our actions, own of our thoughts?"

Conclusion

The history of computing is a U-shaped narrative of hype, failure, and eventual reliable utility. We are currently in the "hype" phase of generative AI, captivated by the technology's ability to mirror human behavior, much like the census workers were captivated by punch cards. However, the lesson of history is that technology eventually settles into the background, becoming a standardized, reliable tool.

From the human computers mapping the stars to the neural networks generating text, the goal remains unchanged: the systematization of the world to produce uniform quality at the lowest possible cost. As we integrate these new systems, we must continually ask not just what the machine can do, but who owns the data it processes and how it reshapes our understanding of ourselves.
