Archive for MIT

Two Comments on Computer History (History of email)

Posted in Uncategorized on June 22, 2011 by spinoza1111

The New York Times has here a series on the “invention” of email: these are my comments at the site, under moderation at this time. Click the orange word “series” to open the series in a different window.

Its-Story

Great stuff BUT: this isn’t a his-story, nor a feminist her-story: it is or should be told as a Marxist its-story.

What do I mean?

I mean that these “histories of software” focus without justification on the East Coast and the Ivy League, including MIT, when in fact the same experiences were happening simultaneously all over the world. Far from saying, at Roosevelt University’s computer center in 1970 (with its 8K IBM 1401), “hey, they have email on Multics, we should install CTSS to get it”, we created readme files: punched card decks labeled “run this” which would print instructions to the operator.

And worldwide, Russian programmers were reinventing the wheel, Chinese programmers were printing big characters synthesized out of the fixed Western characters on what printers they could find, and UK programmers were demonstrating that Manchester was better than Oxbridge at computation.

Our managers exhorted us not to reinvent the wheel and paid for us to go to conferences at prestige venues, and bring back magnetic tapes loaded with software. We ignored them in many cases because at that time, it was usually harder to install software written in a different environment than it was to write it from scratch if you were any good, and we were…you had to be to survive early computer science classes, which were utter chaos.

If the University of Chicago had email, we would say, cool, I can program that. And then to the dismay of girlfriends we would spend hours doing precisely that: reinventing the wheel.

It appears from the record that in the Soviet Union, computer managers forbade such frolics because it was the official policy of the Soviet Union not to do theoretical computer science and to follow (or steal) US practice, and this created dissidence and discontent.

But: the active element wasn’t the human element. Technology had taken on a life of its own. One innovation would support another, so we as early hackers lost our autonomy. As early as 1976, in his book Computer Power and Human Reason, MIT computer scientist Joseph Weizenbaum called us addicts who were being enslaved to the machine. Two years later, SUNY sociologist Phillip Kraft, in Programmers and Managers, showed the most thoughtful of us the writing on the wall: the mastery we had over working conditions and even salary was something capitalism would eliminate.

It’s an its-story in which the machine had the last laugh.

Self Promotion?

In #20, masayaNYC says “self-promotional”: I have to agree. We all were employees working on a team.

The article uses a 19th century, Thomas Edison, Claude Monet individualist model to narrate the 1960s computer worker as “in reality” a lone inventor working in a lab, or, in more fevered narratives such as those retailed by Apple, a Bohemian working in a garret late o’ night as his girlfriend slumbers on a messy bed.

The reality was that you sweated the draft, graduated, your father (remembering the depression) would say hey hey now what you young brute, and you’d get hired as a programmer. Yes, there were opportunities to innovate but for the most part, the innovators were ignored or got into hot water.

A single programmer of that time could write an “assembler” or “compiler” or “email”. The problem was that these words meant nothing to the institution, and the mission was to create an “operating system” something which was by definition beyond the ability of mere mortals at the time to create by themselves…beyond something like the early unix which was incomprehensible to nonprogrammers.

Multics was late and nearly (but not quite) dead on arrival. In response a team developed a downsized version circa 1970 which became unix. This was redesigned ten years later for the IBM PC, and the redesign was essentially stolen from a comp sci prof by an ethically challenged graduate student to become Linux.

The real effort was collective (pace Ayn Rand). The modal foot soldier in that effort was NOT a young male working all night. Many women worked very hard at coding and documentation while keeping reasonable hours, often out-performing the young dudes because they realized their work needed to be used by others. Other women compassionately managed and protected the often asocial young males: my supervisor at Princeton, for example, had to explain to higher management that just because I would swear at computers did NOT mean I was dangerous to be around.

The American-Dutch computer scientist Edsger Wybe Dijkstra died in 2002 and is almost forgotten (he badly needs a scientific biography like that of John “A Beautiful Mind” Nash). He did invent many useful things and ideas, but in the modern era things ride us far more than in Edison’s time, so Dijkstra’s insight was (in the spirit of a thinker Dijkstra probably knew nothing about, Theodor Adorno) essentially negative: the systems we’d grown able to develop, courtesy of the blind evolution of technology, had simply outrun our capability to master them, and to become enchanted by our apparent mastery of, or invention of, details such as email or PowerPoint was childish.

The very evolution of the artifacts has changed human epistemology to the point where we don’t credit the output of the system. Instead of “blind acceptance of authority”, the inverse problem occurs: we renarrate our ignorance as a more creditable skepticism. For example, the official report on 9/11 uses the phrase “the system was blinking red” to summarize what our complex tools were telling us: but by that time we “knew”, too well, that the artifact might be wrong.

I mean: I invented the mobile phone in 1979. No, scratch that. I was engaged as a consultant to develop tools to help develop tools at Motorola in Schaumburg to produce, in fact, the first viable, workable, saleable “brick” as seen in 1980s movies, usually in the hands of scumbags on yachts. I did the work but the core team rejected it because it ran on a mainframe and they, rightly in my opinion, wanted to use the Z-80 chip to write the tools as well as the operating system for mobile telephony. That is called “eating your own dog food” and they were right.

But as an employee in our system it was my job to do it wrong, and by that time I had kids.

This, perhaps, is why people will post remarks to the effect that the “winners” (the sort of people who write the history) are self-serving. As Joe Weizenbaum, the MIT professor who saw the toxic and addictive side of hacking in 1976 in Computer Power and Human Reason, observed, no man is an island: we hackers depended on lights, power and air conditioning, and we were in denial.

Which is why, I have come to believe, flaming and bullying have increased on hacker chat sites. It’s because the autonomy has disappeared. In one example I have examined on this blog, a “hacker” with absolutely no academic preparation for computer science, in an essentially clerical job, who is unable to code and apparently takes a perverse pride in it, posted an attack on a degreed computer author in the 1990s which “went viral”.

We didn’t waste our time with this shit back in the 1970s to the precise extent we had autonomy and could develop interesting things.

Corporate Decision

George Tooker, Corporate Decision

We sought an escape from a Hegelian contradiction. In our name, McNamara had used operations research and mathematics to set half of Japan on fire with countless deaths, and Dresden had been destroyed. Our fathers had reconciled with this system and were damned if we should live at home on their earnings.

We wanted to humanize the inhuman and in Captain America’s words in Easy Rider, we blew it. In seeking an autonomy which our mothers, girlfriends and wives did not have we destroyed autonomy.

In accounting offices of the 1960s, a clerk with good skills could spot malfeasance and misfeasance. But in the 2000s, not even programmers could notice that spreadsheet A for contract A depended on spreadsheet B for contract B, and B depended on C, and C depended on…A. Toxic loans were bundled and sold as gold. The result? The rich became the superrich, and a broad-based attack on public goods ensued.
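The circularity described above is easy to state and hard to see by eye. A minimal sketch, with hypothetical sheet names standing in for the contracts, shows how a simple depth-first walk surfaces the A→B→C→A loop that no clerk inspecting one spreadsheet at a time would notice:

```python
# Hypothetical dependency graph: sheet A feeds B, B feeds C, C feeds back into A.
deps = {"A": ["B"], "B": ["C"], "C": ["A"]}

def find_cycle(graph):
    """Depth-first search; returns one dependency cycle as a list, or None."""
    def visit(node, path, seen):
        if node in path:                      # node already on the current walk: a loop
            return path[path.index(node):] + [node]
        if node in seen:                      # already explored, no cycle through here
            return None
        seen.add(node)
        for dep in graph.get(node, []):
            found = visit(dep, path + [node], seen)
            if found:
                return found
        return None

    for start in graph:
        cycle = visit(start, [], set())
        if cycle:
            return cycle
    return None

print(find_cycle(deps))  # → ['A', 'B', 'C', 'A']
```

The point of the sketch is only that the cycle is a global property of the whole bundle: each individual edge looks perfectly reasonable on its own.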

Let us now praise famous men…Prof Walter Lewin

Posted in Uncategorized on May 10, 2009 by spinoza1111

Professor Walter Lewin

I am reading about physics at MIT OpenCourseWare. Sure, I took the “hard” physics class at Roosevelt University in 1970 under a great man, who used to be found after classes having a couple of blasts at Jimmy Wong’s across Wabash, but I need to review the basics in order to understand quantum computation.

Prof Walter Lewin is here demonstrating conservation of mechanical energy. He knows the heavy iron ball will not kill him since he has calculated that it must be so (es muss sein), but he is a brave man in view of quantum theory alone. As a Dutchman, he is in the tradition of Edsger Dijkstra, who believed that you could prove things about software.
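The calculation he trusts his skull to is a one-line energy argument; sketched in standard notation (the symbols here are mine, not the lecture’s):

```latex
% Total mechanical energy of the pendulum ball is conserved:
\frac{1}{2}mv^2 + mgh = \text{const}
% Released from rest (v = 0) at height h_0, at any later height h:
%   \frac{1}{2}mv^2 = mg(h_0 - h) \ge 0  \implies  h \le h_0
% The ball can swing back at most to its release height, never higher,
% so a ball released from the chin cannot return to strike it.
```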

I sensed at Princeton that below the level of the arrogant superstar, for whose courses students sign up only to find themselves taught by bitter, twisted, and prematurely aged graduate students, there are adjunct, associate and full professors who have made it their life’s work to teach without necessarily sacrificing research as the playbook says they must.

E. D. Klemke chaired the department of Philosophy at Roosevelt University before it was re-engineered by President Rolf Weil from collective social mobility (“education for freedom”) to individual social mobility (“just watch me”). Disgusted by Weil’s down-sizing of the humanities, he left for the state university of Iowa, where he taught undergraduate sections while publishing. But when he was nominated for a lifetime achievement award, it was turned down, for he had “published too much”.

That is, as far as I can determine, Klemke had refused to stay within neat lines drawn in higher education, lines intended to preserve a class society in which the sons and daughters of the superrich, and selected sons and daughters of the poor, study under professors who publish and their graduate assistants, while students at the state university of Iowa need to study under professors who never have had, nor have written, an original thought.

Walter Lewin of MIT, and in my experience mathematics professor Hale Trotter of Princeton, are dedicated teachers who’ve also published but they are in no sense “public intellectuals” like Chomsky, also at MIT.

Many “public intellectuals”, however, could not explain the basics of their field and one senses, at times, that they are weary (at a certain level of subgenius) of their lives and fields.

An Einstein was happy to talk about basic maths to schoolchildren who wrote to him, but I noticed, when Noam Chomsky was “available for questions” on the old Z Magazine bulletin board system, that people who asked Chomsky about the relationship of his science to his politics, and of both to French theory, were asking for trouble.

This is because Chomsky had had, about ten years prior, the experience that basically provincial Americans (provincial in their assumption that they speak for all mankind) have in France when they meet public intellectuals who believe that they speak for all mankind. Chomsky concluded that Foucault and Lacan were frauds because neither accepted the need to be respectfully silent about scientific results.

On the Z BBS, Chomsky’s questions were channeled by a certain Mike Albert, who’d been a student activist in the 1960s but seemed at the time to treat Chomsky as some sort of god in a cult of personality: a god who couldn’t be bothered with real questions but who instead should be permitted basically to repeat his ill-informed, indeed Utopian, political “philosophy”…basically, a naive anarchism of the sort that doesn’t survive holding down a real job in the real world, serving in the military, or raising children without enough money.

Real professors aren’t media superstars. Like Klemke they drop dead in front of real survey classes after years of work in obscurity.

And no, they don’t date their students. I teach six days a week and I notice that precisely because (as I pointed out to friend Stanley Fish) the teacher-student relationship goes back considerably further than the modern arms-length contract, there are erotic elements in teaching which constitute in a world of commodities a thing with aura in Walter Benjamin’s sense, or to put it in my own elegant way, a stick of shitfire with a wick on it. Which is why the casual prof’s assumption in many universities that he has sexual access to his students should be a termination offense even if he’s tenured.

Foucault tried to get Chomsky wise to the fact that anything could be a “capillary”, transmitting power; precisely to the extent that Princeton graduate students felt themselves unmanned by being invited into a priestly caste (with the subconscious association of the priest with the eunuch), they seemed to me to feel the need to transmit the most odious forms of power, as in the Eighties catchphrase template “I’m a/an [insert name of learned profession] and you’re not”. But Chomsky insisted that no, people left to their own devices will happily cooperate like fawns prancing about Arcadia, and the result was that he needed Mike Albert to do his dirty work for him.

Any hint of sexual harassment works homeopathically, because it introduces particles of the belief that the sublimation never existed. Plato may or may not have been “gay”. It’s perfectly possible, as the recent movie “The Forty Year Old Virgin” suggests, that we can sublimate for years, and it’s also possible, as that movie does not suggest, that desublimation, rather than pure pleasure-principle, becomes another Foucauldian category of power. The “Forty Year Old Virgin’s” coworkers in fact brutally socialise him in one reading of the movie, forcing him to shoulder a sort of white man’s burden in which he, like they, compulsively pursues post-feminist women who despise and mock them.

The modern Dr Phil and Oprah message is that what was “sex” in my youth, what was a lot of fun and without consequences in 1969 when I lost my virginity, is now a “relationship”, and that the male of the species must “work” at this “relationship”. Wow. In my day, the male of the species, if he felt a calling to be a miserable SOB, didn’t get into a “relationship”. He joined the United States Marines or Quigley Seminary in Chicago.

But this “work” is structurally akin to Stalinism in the sense that in Stalinism, the original revolutionary promise of Lenin’s last days, a very brief period of artistic experimentation and NEP markets, was withdrawn in favor of a return to exchange relations, now become nightmarish in that just as you never had enough money saved under the Tsar, it was almost impossible to be the “perfect worker”.

Likewise, in today’s pop psychology (changes in which reflect deeper and more seismic shifts) you’re not “OK”, especially if you’re a heterosexual male. You’ve become the bourgeois or landlord class whether you like it or not.

Over time, these mental structures become a sort of collective farce or dream in which the awakening feels like a new dawn, but then the awakening in turn becomes the New Bullshit. It’s dialectical.

The “sexual liberation” of 1969 (when a terribly nice girl took my virginity on the same night Neil Armstrong and Buzz Aldrin landed on the Moon) became sexual harassment of students by teachers. Foucault was in other words prescient, but Chomsky had no idea what he was talking about; Chomsky’s naivety is shown by the fact that he gave an interview to Hustler magazine a few years ago without realizing it was a total stroke book.

Academic superstars, like politicians, are in fact often sealed off from knowledge by their amanuenses and gofers. The real work of universities is done by adjuncts, bitter, twisted and prematurely aged graduate students, and full and associate profs who escape the limelight, especially Oprah. She has her sights on Dr. Lewin: one hopes he avoids her show.