JOHN MCCARTHY
The modern digital age began at the Massachusetts Institute of Technology (MIT) in the late 1950s, where a mathematician and engineer named John McCarthy offered the first undergraduate course in computer programming. He pondered ways of programming the hulking mainframes of the day so that they could act in creative ways, learn to adapt to their environments, be linked in complex networks, and evolve to become smarter on their own. To describe this dynamic vision of computing, he coined the term artificial intelligence (AI).
McCarthy was a legendary eccentric on a campus full of eccentrics. He had a habit of furiously pacing while thinking; if he was asked a question, he might simply walk away without saying goodbye, only to reappear several days later with an answer, as if the conversation had never been interrupted. If his colleagues wanted him to read a paper, instead of bringing it to his office, where it would inevitably get lost, they would leave a copy on their own desks; McCarthy would eventually stroll in, pick it up, and march off to read it, usually without uttering a word. First encounters with McCarthy were unnerving: his greeting consisted of an expectant stare – no words at all. Discourse by his visitor brought a set of mumbles, which slowly increased in volume and clarity; only when his mind reached the surface was something resembling normal conversation possible. His mind was like a vehicle streamlined for rapid passage through the fluid of thought, capable of maneuvering with little outside friction. But in the social terrain, his streamlined concentration became awkward and unwieldy.
Physically uncoordinated, he nevertheless took up mountain climbing, sailing, and piloting private planes. He would coach himself aloud through each step of a final approach – “prop feathered . . . mixture full rich . . . airspeed check . . . okay, now we’ll do this” – only to realize that he had already landed.
But his life’s work was never in doubt. When he was eight, he decided to become a scientist. His parents’ idealism infused his hope for computers as facilitators of democracy at a time when many left-wingers had a visceral distrust of technology. In high school, McCarthy taught himself calculus from college textbooks. At fifteen, he enrolled at the California Institute of Technology. There, he began thinking about designing machines that could simulate the human acquisition of knowledge. McCarthy was instrumental in developing the concept of time-sharing, which allowed multiple users to gain access to centralized computing resources through a distributed network of terminals. He advocated installing a terminal in every home, convinced that someday it would be commonplace for people to use them to read instantly updated news, order books, buy plane tickets and reserve hotel rooms, edit documents remotely, and determine the efficacy of medical treatments.
McCarthy’s students devised the first program that enabled a computer to play chess well.
McCarthy’s most lasting contribution to his field was Lisp, a high-level programming language that enabled AI researchers to represent an unprecedented range of real-world events in their code. Unlike most programming languages of its vintage, it is still in wide use. McCarthy was ready for a change in the early 1960s when Stanford offered him a full professorship. He thrived in the hothouse of innovative ideas and technology that would soon be dubbed Silicon Valley, launching the famed Stanford Artificial Intelligence Laboratory (SAIL). By the early 1980s, he was already living in the future he had foreseen a decade earlier: he could fetch his email, listen to the radio, revise and spell-check a paper on a remote server, play chess or Go, run searches on stories moving over the Associated Press wire, or pull up an up-to-date list of restaurant recommendations from programmers all over the world.
He certainly displayed many of the classic features of Asperger’s syndrome: his brusqueness, his single-minded focus to the point of rudeness, his physical clumsiness, and his habit of coaching himself aloud when under stress. He also displayed many of its positive traits: a fascination with logic and complex machines, a gift for puns and aphorisms, an uncompromising personal ethic, and the ability to solve problems from angles that his more socially oriented colleagues missed. He was able to carve out a niche in an emerging field that was perfectly suited to his strengths and tolerant – indeed, appreciative – of his many eccentricities.
His labs at MIT and Stanford were elaborate playgrounds for his extraordinary mind. They became magnets for other geniuses who were equally committed to the vision of a world empowered by access to computing – including Steve Jobs and Steve Wozniak, who went on to found Apple.
The culture of Silicon Valley began adapting to the presence of a high concentration of people with autistic traits even before the term Asperger’s syndrome was coined. A therapist named Jean Hollands wrote The Silicon Syndrome, a book about navigating “high-tech relationships” with a distinctive breed of intensely driven “sci-tech” men who loved to tinker with machines, were slow to pick up on emotional cues, had few if any close friends outside their professional circles, approached life in a rigorously logical and literal fashion, and addressed problems in intimate relationships by “seeking data.” She received letters from the wives of engineers, coders, and math and physics professors all over the world. Autism was never mentioned, but ten years later the word could have been swapped in without changing another word of the text.
Ultimately the future belonged not to the big mainframes and “dumb terminals” McCarthy loved but to the smart little machines being soldered together in garages. The task of claiming the power of computing for the many would fall to Internet pioneers like Vint Cerf and Tim Berners-Lee – and to an autistic engineer who launched the first social network for the people in a record store in Berkeley.
LEE FELSENSTEIN
He had engineering in his blood. His grandfather, William T. Price, made a fortune by shrinking the design of diesel engines so they could fit into trains and trucks. He, too, was on the autistic spectrum. In third grade, Lee sketched exhaust pipes and compressors while coming up with schemes for redesigning automobiles to reduce air pollution. When a teacher accused him of daydreaming in class, he replied, “I’m not daydreaming, I’m inventing.” At eleven, he built a crystal radio set; he saw his first computer at a science museum in Philadelphia and hung around the machine all day. He then took a correspondence course in radio and TV repair that included lessons on how to run your own business. He started repairing broken TVs in his basement and cannibalized busted consoles for glowing tubes to use in his experiments.
At the University of California at Berkeley, he became the geek-in-residence for the war protests and free speech movement. At one demonstration, when police were preparing for mass arrests, he was told to build a police radio. He came to realize that the old broadcast methods of organizing – notes passed around to coordinate, leaflets handed out to inform students of relevant issues – could be done better by a decentralized, user-driven computer system.
He didn’t know that he was autistic, and the psychiatric establishment didn’t acknowledge that people like him existed. His girlfriends complained that he didn’t respond appropriately in social situations and never seemed to feel at home among people. By 1968, the stress of being an undiagnosed autistic in the middle of a cultural revolution had taken a heavy toll. After crashing into a major depression, he dropped out of the university, started psychotherapy, and took a job as a junior engineer.
By reading manuals, he taught himself state-of-the-art programming, punching holes in paper tape; there was no operating system and no software. At a conference, a researcher at the Stanford Research Institute named Doug Engelbart described ways of using computers not to replace human intelligence but to augment it. He set forth the fundamental elements of the modern digital age: graphical user interfaces, multiple-window displays, mouse-driven navigation, word processing, hypertext linking, videoconferencing, and real-time collaboration. These concepts – refined by Alan Kay and others at Xerox PARC – inspired Steve Jobs to build the Macintosh, the first personal computer (PC) designed for the mass market.
Felsenstein believed that computer networks could perform many of the functions of personal filing systems much faster and better, and they didn’t forget anything. He was also fascinated by the use of tools to facilitate conviviality – one of the many aspects of social interaction that he had always found difficult and confusing. With two fellow programmers, he set out to find an affordable computer sufficiently powerful to do the job. They wrangled a long-term lease on an SDS 940 (retail cost: $300,000) from the Transamerica Corporation – a machine twenty-four feet long that required a fleet of air conditioners to stay cool.
For people like Felsenstein who struggled to express themselves in face-to-face situations, computer networks held the potential not just to “augment” communication but to make it possible, period – minus the things that normally made conversation so arduous, such as eye contact, body language, tone, and the necessity of making a good impression. The practical constraints of communicating online also required many aspects of social interaction that are normally implicit to be made explicit. Emoticons like :-) acted as a kind of social captioning for people who had trouble parsing sarcasm and innuendo.
In 1973, they created the first electronic bulletin board in history – billed as an “information flea market” and, in effect, the first wide-open door to cyberspace – at the top of a staircase at Leopold’s Records on Telegraph Avenue in Berkeley. Information trickled across the Bay at a measly ten characters a second via an Oakland telephone exchange that allowed a free all-day call to San Francisco. Everyone who ambled up the stairs became interested in using it – to find musicians for bands, exchange a myriad of items and services, share poems and writing, solicit rides, discuss the war, gay liberation, and the energy crisis, and sell things. More than just a community bulletin board, the network quickly became “a snapshot of the whole community.” However, without a sustainable economic model, the group was unable to cover the considerable cost of maintaining the SDS 940.
The smashing success of the project was gratifying for Felsenstein, because a feeling of belonging to a community was precisely the thing that had always eluded him – even in the counterculture that was supposed to offer it to those who had never fit in anywhere else. “As a kid, I had a feeling that I was ensconced in some sort of alcove, behind a wall, and that the street was out there. I could see everyone else walking around engaging in life, but I couldn’t go out there. So what I was trying to do with the Community Memory was trying to expand the alcove.”
He moved on to other projects, including designing the Osborne 1 – the first truly portable personal computer, three years before the Macintosh. However, he continued to struggle with depression and an inability to read other people’s intentions despite years of psychotherapy. Finally, in the 1990s, Felsenstein heard about Asperger’s syndrome and recognized not only himself but other members of his family. Reading about autism online, he came to think of his Asperger’s not just as a set of deficits but as his “edge” – the edge he had inherited from his grandfather, which he has put to work in his forty-year career in technology.
The text-based nature of online interaction eventually provided the foundation for something Leo Kanner could not have imagined: the birth of the autistic community. But two things had to happen first. Kanner’s notion that autism was a rare form of childhood psychosis would have to be permanently laid to rest. Then autistic people themselves would have to overturn the notion that they were the victims of a global epidemic.