Archive for computers

Two Comments on Computer History (History of email)

Posted in Uncategorized on June 22, 2011 by spinoza1111

The New York Times has a series here on the “invention” of email: these are my comments at the site, under moderation at this time. Click the orange word “series” to open the series in a different window.

Its-Story

Great stuff BUT: this isn’t a his-story, nor a feminist her-story: it is or should be told as a Marxist its-story.

What do I mean?

I mean that these “histories of software” focus without justification on the East Coast and the Ivy League, including MIT, when in fact the SAME experiences were happening simultaneously all over the world. Far from saying, at Roosevelt University’s computer center in 1970 (with its 8K IBM 1401), “hey, they have email on Multics, we should install CTSS to get it”, we created readme files: punched card decks labeled “run this” which would print instructions to the operator.

And worldwide, Russian programmers were reinventing the wheel, Chinese programmers were printing big characters synthesized out of the fixed Western characters on whatever printers they could find, and UK programmers were demonstrating that Manchester was better than Oxbridge at computation.

Our managers exhorted us not to reinvent the wheel and paid for us to go to conferences at prestige venues, and bring back magnetic tapes loaded with software. We ignored them in many cases because at that time, it was usually harder to install software written in a different environment than it was to write it from scratch if you were any good, and we were…you had to be to survive early computer science classes, which were utter chaos.

If the University of Chicago had email, we would say, cool, I can program that. And then to the dismay of girlfriends we would spend hours doing precisely that: reinventing the wheel.

It appears from the record that in the Soviet Union, computer managers forbade such frolics, because it was the official policy of the Soviet Union not to do theoretical computer science but to follow (or steal) US practice, and this created dissidence and discontent.

But: the active element wasn’t the human element. Technology had taken on a life of its own. One innovation would support another, so we early hackers lost our autonomy. As early as 1976, in his book Computer Power and Human Reason, MIT computer scientist Joseph Weizenbaum called us addicts who were being enslaved to the machine. A year later, SUNY sociologist Philip Kraft, in Programmers and Managers, showed the most thoughtful of us the writing on the wall: the mastery we had over our working conditions and even our salaries was something capitalism would eliminate.

It’s an its-story in which the machine had the last laugh.

Self Promotion?

In #20, masayaNYC says “self-promotional”: I have to agree. We were all employees working on a team.

The article uses a 19th-century, Thomas Edison, Claude Monet individualist model to narrate the 1960s computer worker as “in reality” a lone inventor working in a lab, or, in more fevered narratives such as those retailed by Apple, a Bohemian working in a garret late o’ night as his girlfriend slumbers on a messy bed.

The reality was that you sweated the draft, graduated, your father (remembering the Depression) would say “hey hey now, what, you young brute,” and you’d get hired as a programmer. Yes, there were opportunities to innovate, but for the most part the innovators were ignored or got into hot water.

A single programmer of that time could write an “assembler” or “compiler” or “email”. The problem was that these words meant nothing to the institution, and the mission was to create an “operating system”, something which was by definition beyond the ability of mere mortals of the time to create by themselves…apart from something like the early unix, which was incomprehensible to nonprogrammers.

Multics was late and nearly (but not quite) dead on arrival. In response, a team developed a downsized version circa 1970 which became unix. Nearly two decades later, a computer science professor redesigned it for the IBM PC, and that redesign was, in my view, essentially stolen from him by an ethically challenged graduate student to become Linux.

The real effort was collective (pace Ayn Rand). The modal foot soldier in that effort was NOT a young male working all night. Many women worked very hard, while keeping reasonable hours, at coding and documentation, often out-performing the young dudes because they realized their work needed to be used by others. Other women compassionately managed and protected the often asocial young males: my supervisor at Princeton, for example, had to explain to higher management that just because I would swear at computers did NOT mean I was dangerous to be around.

The Dutch-American computer scientist Edsger Wybe Dijkstra died in 2002 and is almost forgotten (he badly needs a scientific biography like that of John “A Beautiful Mind” Nash). He did invent many useful things and ideas, but in the modern era things ride us far more than they did in Edison’s time, so Dijkstra’s insight was (in the spirit of a thinker Dijkstra probably knew nothing about, Theodor Adorno) essentially negative: it was that the systems we’d grown able to develop, courtesy of the blind evolution of technology, had simply outrun our capability to master them, and that to become enchanted by our apparent mastery of, or invention of, details such as email or Powerpoint was childish.

The very evolution of the artifacts has changed human epistemology to the point where we don’t credit the output of the system. Instead of “blind acceptance of authority”, the inverse problem occurs: we renarrate our ignorance as a more creditable skepticism. For example, the official report on 9/11 uses the phrase “the system was blinking red” to summarize what our complex tools were telling us: but by that time we “knew”, too well, that the artifact might be wrong.

I mean: I invented the mobile phone in 1979. No, scratch that. I was engaged as a consultant at Motorola in Schaumburg to develop tools to help develop tools to produce, in fact, the first viable, workable, saleable “brick”, as seen in 1980s movies, usually in the hands of scumbags on yachts. I did the work, but the core team rejected it because it ran on a mainframe, and they, rightly in my opinion, wanted to use the Z-80 chip to write the tools as well as the operating system for mobile telephony. That is called “eating your own dog food”, and they were right.

But as an employee in our system it was my job to do it wrong, and by that time I had kids.

This, perhaps, is why people will post remarks to the effect that the “winners” (the sort of people who write the history) are self-serving. As Joe Weizenbaum, the MIT professor who saw the toxic and addictive side of hacking in 1976 in Computer Power and Human Reason, observed, no man is an island: we hackers depended on lights, power and air conditioning, and we were in denial.

Which is why, I have come to believe, flaming and bullying have increased on hacker chat sites. It’s because the autonomy has disappeared. In one example I have examined on this blog, a “hacker” with absolutely no academic preparation for computer science, holding an essentially clerical job, who is unable to code and apparently takes a perverse pride in it, posted an attack on a degreed computer author in the 1990s which “went viral”.

We didn’t waste our time with this shit back in the 1970s to the precise extent we had autonomy and could develop interesting things.

Corporate Decision

[Image: George Tooker, Corporate Decision]

We sought an escape from a Hegelian contradiction. In our name, McNamara had used operations research and mathematics to set half of Japan on fire with countless deaths, and Dresden had been destroyed. Our fathers had reconciled themselves to this system and were damned if we should live at home on their earnings.

We wanted to humanize the inhuman, and in Captain America’s words in Easy Rider, we blew it. In seeking an autonomy which our mothers, girlfriends and wives did not have, we destroyed autonomy.

In the accounting offices of the 1960s, a clerk with good skills could spot malfeasance and misfeasance. But in the 2000s, not even programmers could notice that spreadsheet A for contract A depended on spreadsheet B for contract B, and B depended on C, and C depended on…A. Toxic loans were bundled and sold as gold. The result? The rich became the superrich, and a broad-based attack on public goods ensued.
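To make the circularity concrete: what the 1960s clerk would have caught is, in graph terms, a back edge in a dependency graph, and a few lines of code can find it. The sketch below is purely illustrative; the spreadsheet names and the dependency map are hypothetical, not drawn from any real contract data.

```python
# Minimal sketch: detecting a circular dependency among spreadsheets.
# The names and dependencies here are hypothetical illustrations.

deps = {
    "A": ["B"],  # spreadsheet A pulls figures from B
    "B": ["C"],  # B pulls from C
    "C": ["A"],  # C pulls from A -- closing the cycle nobody noticed
}

def find_cycle(graph):
    """Depth-first search; returns one dependency cycle as a list, or None."""
    visiting, done = set(), set()

    def dfs(node, path):
        visiting.add(node)
        path.append(node)
        for dep in graph.get(node, []):
            if dep in visiting:                      # back edge: a cycle
                return path[path.index(dep):] + [dep]
            if dep not in done:
                found = dfs(dep, path)
                if found:
                    return found
        visiting.discard(node)
        done.add(node)
        path.pop()
        return None

    for start in graph:
        if start not in done:
            cycle = dfs(start, [])
            if cycle:
                return cycle
    return None

print(find_cycle(deps))  # ['A', 'B', 'C', 'A']
```

The check is cheap and mechanical, which is the point: nothing in the technology prevented running it; during the fat years, nobody was paid to.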

Further thoughts on Stanley Fish’s essay “Fathers, sons and motorcycles”

Posted in Uncategorized on June 19, 2009 by spinoza1111

Awaiting moderation here. Somewhat extended and modified at this site.

Hmm, people here talk as if temporary *bricolage*-style repairs to cars are a good thing in all or most cases, somehow more Authentic than a by-the-book repair done by a corporate oil-change or repair shop, in a Jargon of Authenticity.

This is to mistake Pirsig, who was in favor of a balance of theory and practice, not all praxis all the time.

I lived in the Midwest too long, and for this reason rode in one too many cars foul with carbon monoxide, whose owners bragged about all the money they saved by means of their shop praxis. I have seen one too many Jesus-is-my-car-insurance beaters and junkers catch fire on the side of the Edens Expressway, their owners usually claiming, loudly, that they know “all about” cars, especially to their women-folk.

[To his credit, Stanley admits being scared of motorcycles, thus exiting the male language-game. But he’s still embedded within the language game of class division.]

As I have said, the by-the-book approach is a Godsend to people who simply want to maintain their car properly in the absence of civilized public transit. For every mute unsung Milton or Dalai Lama operating a car-repair shop, there is a lunatic who tells anyone who will listen of his mystical Secret Knowledge (as if Secret Knowledge weren’t ultimately a contradiction) while using the wrong weight of oil in your oil change.

In Stanley’s example, we are left wondering why the air-conditioning system of the Porsche demands a different mixture of Freon (or whatever) than the one specified in the shop manual. If the Porsche is so great (and costs so much), why don’t its owners and their repairmen get a corrected shop manual? The shop manuals for my Ford Escort were great, and I for one have always loved reading and hewing unto manuals, possibly because of my German ancestry. But part of the Romanticism so unquestioned in the United States is not reading the manual; as a result, many car and computer manuals are unavailable, or available only online.

Shopping for a personal computer was a delight in 1979, when the likes of a Ted Nelson, the Harvard-educated creator of Xanadu, sold computers in my home town of Evanston, Illinois. It became a nightmare when, *circa* 1985, the shop clerks became pimply, arrogant and ignorant little creeps who claimed to know “all about” computers even though, unlike Ted, they did not have the verbal skills to express their Secret Knowledge, its ineffability making it again quite a paradox.

Pirsig himself was not prey to the Romanticism of orality, any more than he was overinvested in writing when the writing was incorrect. In fact, it appears from ZATAOMM that he spent time in Minnesota as a technical writer (my guess is for the old Control Data computer corporation), and on the job he was appalled by the slapdash and incorrect writing of his mates. If the manual is wrong, the ultimate cure is more writing, whether new pages for insertion in the three-ring binder, or what Derrida would call arche-writing: the important words of the guy (or gal) in the shop who happens to know the manual is wrong, which should be written down ASAP.

And with all due respect to the gentleman with “30 years of experience in software”, I have 30 years of experience too, and I wrote [incipit shameless plug] “Build Your Own .Net Language and Compiler” (Apress 2004).

I soon learned that the type of programmer who claims also to be a handyman about the house or trailer is usually precisely the sort of programmer who is underinformed about computer science and for this reason overfond of “hacking”, where “hacking” means creating error-ridden software that merely seems to run.

Pirsig set his face against this orality in favor of writing: but he found in the zones and spaces of writing (such as, perhaps, Control Data and the University of Chicago) a hollow commitment to simple truth, and a good-enough-for-government-work attitude.

We’re paying the price, big time, for this attitude, since part of the “credit crisis” was the creation of software which enabled vastly overcomplicated financial promises, derivatives and instruments. CDOs and derivatives are themselves mysterious “writing”, typically marks in incomprehensible, almost inaccessible databases and arcane programming languages, that needed documentation; but there was no time, during the fat and swinish years, for “documentation”.

Pirsig wondered if people, surrounded by vastly complex systems, could actually keep those systems going. Apparently, this isn’t the case.