Two Comments on Computer History (History of email)

The New York Times has a series here on the “invention” of email; these are my comments at the site, currently under moderation. Click the orange word “series” to open the series in a new window.

Its-Story

Great stuff BUT: this isn’t a his-story, nor a feminist her-story: it is or should be told as a Marxist its-story.

What do I mean?

I mean that these “histories of software” focus without justification on the East Coast and the Ivy League, including MIT. In fact the SAME experiences were happening simultaneously all over the world, and far from saying, at Roosevelt University’s computer center in 1970 (with its 8K IBM 1401), “hey, they have email on Multics, we should install CTSS to get it”, we created readme files: punched card decks labeled “run this” which would print instructions to the operator.

And worldwide, Russian programmers were reinventing the wheel, Chinese programmers were printing big characters synthesized out of the fixed Western characters on whatever printers they could find, and UK programmers were demonstrating that Manchester was better than Oxbridge at computation.

Our managers exhorted us not to reinvent the wheel and paid for us to go to conferences at prestige venues, and bring back magnetic tapes loaded with software. We ignored them in many cases because at that time, it was usually harder to install software written in a different environment than it was to write it from scratch if you were any good, and we were…you had to be to survive early computer science classes, which were utter chaos.

If the University of Chicago had email, we would say, cool, I can program that. And then to the dismay of girlfriends we would spend hours doing precisely that: reinventing the wheel.

It appears from the record that in the Soviet Union, computer managers forbade such frolics, because it was official Soviet policy not to do theoretical computer science but to follow (or steal) US practice, and this created dissidence and discontent.

But the active element wasn’t the human element. Technology had taken on a life of its own. One innovation would support another, so we early hackers lost our autonomy. As early as 1976, in his book Computer Power and Human Reason, MIT computer scientist Joseph Weizenbaum called us addicts being enslaved to the machine. Two years later, SUNY sociologist Philip Kraft, in Programmers and Managers, showed the most thoughtful of us the writing on the wall: the mastery we had over our working conditions, and even our salaries, was something capitalism would eliminate.

It’s an its-story in which the machine had the last laugh.

Self Promotion?

In #20, masayaNYC says “self-promotional”: I have to agree. We were all employees working on a team.

The article uses a 19th-century, Thomas Edison, Claude Monet individualist model to narrate the 1960s computer worker as “in reality” a lone inventor working in a lab, or, in more fevered narratives such as those retailed by Apple, a Bohemian working in a garret late o’ night as his girlfriend slumbers on a messy bed.

The reality was that you sweated the draft, graduated, your father (remembering the Depression) would say “hey, hey, now, what, you young brute”, and you’d get hired as a programmer. Yes, there were opportunities to innovate, but for the most part the innovators were ignored or got into hot water.

A single programmer of that time could write an “assembler” or “compiler” or “email”. The problem was that these words meant nothing to the institution, and the mission was to create an “operating system”, something which was by definition beyond the ability of mere mortals of the time to create by themselves…beyond something like the early unix, which was incomprehensible to nonprogrammers.

Multics was late and nearly (but not quite) dead on arrival. In response a team developed a downsized version circa 1970 which became unix. This was redesigned ten years later for the IBM PC, and the redesign was essentially stolen from a comp sci prof by an ethically challenged graduate student to become Linux.

The real effort was collective (pace Ayn Rand). The modal foot soldier in that effort was NOT a young male working all night. Many women worked very hard, but kept reasonable hours, at coding and documentation, often outperforming the young dudes because they realized their work needed to be usable by others. Other women compassionately managed and protected the often asocial young males: my supervisor at Princeton, for example, had to explain to higher management that just because I would swear at computers did NOT mean I was dangerous to be around.

The American-Dutch computer scientist Edsger Wybe Dijkstra died in 2002 and is almost forgotten (he badly needs a scientific biography like that of John “A Beautiful Mind” Nash). He did invent many useful things and ideas, but in the modern era things ride us far more than in Edison’s time, so Dijkstra’s insight was (in the spirit of a thinker Dijkstra probably knew nothing about, Theodor Adorno) essentially negative: the systems we’d grown able to develop, courtesy of the blind evolution of technology, had simply outrun our capability to master them, and to become enchanted by our apparent mastery of, or invention of, details such as email or PowerPoint was childish.

The very evolution of these artifacts has changed human epistemology to the point where we don’t credit the output of the system. Instead of “blind acceptance of authority” we have the inverse problem, in which we renarrate our ignorance as a more creditable skepticism. For example, the official report on 9/11 uses the phrase “the system was blinking red” to summarize what our complex tools were telling us: but by that time we “knew”, too well, that the artifact might be wrong.

I mean: I invented the mobile phone in 1979. No, scratch that. I was engaged as a consultant at Motorola in Schaumburg to develop tools to help develop tools to produce, in fact, the first viable, workable, saleable “brick”, as seen in 1980s movies, usually in the hands of scumbags on yachts. I did the work, but the core team rejected it because it ran on a mainframe, and they, rightly in my opinion, wanted to use the Z-80 chip to write the tools as well as the operating system for mobile telephony. That is called “eating your own dog food”, and they were right.

But as an employee in our system it was my job to do it wrong, and by that time I had kids.

This, perhaps, is why people post remarks to the effect that the “winners” (the sort of people who write the history) are self-serving. As Joe Weizenbaum, the MIT professor who saw the toxic and addictive side of hacking in 1976 in Computer Power and Human Reason, understood, no man is an island: we hackers depended on lights, power and air conditioning, and we were in denial.

Which is why, I have come to believe, flaming and bullying have increased on hacker chat sites: the autonomy has disappeared. In one example I have examined on this blog, a “hacker” with absolutely no academic preparation for computer science, in an essentially clerical job, who is unable to code and apparently takes a perverse pride in it, posted an attack on a degreed computer author in the 1990s which “went viral”.

We didn’t waste our time with this shit back in the 1970s to the precise extent we had autonomy and could develop interesting things.

Corporate Decision

George Tooker, Corporate Decision

We sought an escape from a Hegelian contradiction. In our name, McNamara had used operations research and mathematics to set half of Japan on fire with countless deaths, and Dresden had been destroyed. Our fathers had reconciled themselves with this system and were damned if we should live at home on their earnings.

We wanted to humanize the inhuman, and in Captain America’s words in Easy Rider, we blew it. In seeking an autonomy which our mothers, girlfriends and wives did not have, we destroyed autonomy.

In the accounting offices of the 1960s, a clerk with good skills could spot malfeasance and misfeasance. But in the 2000s, not even programmers could notice that spreadsheet A for contract A depended on spreadsheet B for contract B, and B depended on C, and C depended on…A. Toxic loans were bundled and sold as gold. The result? The rich became the superrich, and a broad-based attack on public goods ensued.
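The irony is that the circular dependency described above is mechanically easy to detect, if anyone bothers to model the dependencies at all. A minimal sketch, in Python, with entirely hypothetical spreadsheet names standing in for the contracts: a depth-first search that reports the first cycle it finds.

```python
# Hypothetical sketch: the spreadsheets above, A -> B -> C -> A,
# modeled as a dependency graph. Names are illustrative only.
deps = {"A": ["B"], "B": ["C"], "C": ["A"]}

def find_cycle(graph):
    """Return one dependency cycle as a list of names, or None if acyclic."""
    visiting, done = set(), set()   # nodes on the current path / fully explored

    def dfs(node, path):
        visiting.add(node)
        path.append(node)
        for dep in graph.get(node, []):
            if dep in visiting:
                # back-edge: the path from `dep` onward is a cycle
                return path[path.index(dep):] + [dep]
            if dep not in done:
                found = dfs(dep, path)
                if found:
                    return found
        visiting.discard(node)
        done.add(node)
        path.pop()
        return None

    for start in graph:
        if start not in done:
            cycle = dfs(start, [])
            if cycle:
                return cycle
    return None

print(find_cycle(deps))  # ['A', 'B', 'C', 'A']
```

The point is not the algorithm, which is textbook, but that no clerk, and apparently no programmer, was asked to run anything like it.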

3 Responses to “Two Comments on Computer History (History of email)”

  1. For what it’s worth, I’m with you on the collectivity of software production. The myth of the lone hacker has gone on too long, but the rub is that it’s a convenient myth. Telling the history of software is still too challenging, with still too few conceptual tools to do much else. I’m presenting a paper at 4S in Cleveland in a few months that will make my modest first step at addressing some of these issues. Instead of employing the lone hacker myth (and eliding all the privilege, women, and Others that contributed), I’m working with a factory production model. My hope is that a careful look at software engineering (a discipline Dijkstra practically invented) will tell a tale of a different mode of software production.

  2. spinoza1111 Says:

    I found that reading Adorno in parallel with Dijkstra helps, since they both were intellectual expatriates. Adorno insisted that the individual and his subjectivity survive, but always embedded. Thus the lone hacker might, at the last minute, complete the system, get it done, but he’s standing on the shoulders of giants, midgets, dwarves and gofers.

    Which is why at the end of my software career I preferred “pairwise” development. I actually liked sitting with another person and using speech to clarify software, for I’d found that my individual subjectivity and creativity often created the wrong solution.

  3. spinoza1111 Says:

    Take a look at David Noble’s Forces of Production, a history of the participation of labor unions in machine tool automation. Machinists like my great grandfather were creative human subjects who nonetheless worked in factories and had to get along with others. Management told them to stop setting up tools and use paper tapes created by college boys in the office. These paper tapes created scrap at high speed. The machinists, as individual, creative subjects, fought collectively for Recognition and won the right to do the setup.

Leave a comment