Quick notes on interface design history

Major interface innovations were informed by history and driven by one goal: intellect augmentation

Most of the significant advances in computer technology—including the work that my group did at Xerox PARC—were simply extrapolations of Lick’s vision. They were not really new visions of their own. So he was really the father of it all.

Robert Taylor

Director of Xerox PARC

Vannevar Bush and the Memex






As the Director of the U.S. Office of Scientific Research and Development (OSRD), Vannevar Bush oversaw nearly all military-related research and development in the United States during the Second World War. Having marshaled the forces of more than 6,000 scientists to develop technologies like radar and the atomic bomb that were crucial to the war effort, Bush set his mind to figuring out what all these scientists should do now that the war had ended. In July 1945, he published an article in the Atlantic called As We May Think, arguing that organizing scientific knowledge so that it can be of use is the primary challenge of our time. “Publication has been extended far beyond our present ability to make real use of the record,” creating the need for a “mechanized private file and library... in which an individual stores all his books, records, and communications, and which is mechanized so that it may be consulted with exceeding speed and flexibility. It is an enlarged intimate supplement to his memory.” To make use of this record, the mind should first be freed from repetitive tasks like performing standard calculations. “A mathematician,” Bush wrote, “is primarily an individual who is skilled in the use of symbolic logic on a high plane, and especially he is a man of intuitive judgment… All else he should be able to turn over to his mechanism... Some of [the mechanisms] will be sufficiently bizarre to suit the most fastidious connoisseur of the present artifacts of civilization.”


This speculative device, which Bush named the Memex, had specific and advanced interface characteristics. Some of these seem antiquated, others we might recognize in modern computers, and many are still unrealized. It took the form of a desk with three slanting translucent screens that would “instantly bring files and material on any subject to the operator’s fingertips.” Bush imagined that an operator sitting at the desk would control it via buttons and levers, with a keyboard for direct input of information and notes in the margins. Because the Memex had multiple screens, items could be left in place and consulted alongside others.


Someone who has access to all the world’s information, but lacks a way to make sense of it, is apt to “become bogged down part way there by overtaxing his limited memory.” The Memex’s “essential feature,” therefore, is the ability for its operator to create named trails of information and ideas that can be reviewed alongside one another by “flipping pages” as though one had created a new book. These trails do not fade. Bush imagines someone who once investigated the origin and properties of the bow and arrow for pleasure. “Several years later, his talk with a friend turns to the queer ways in which a people resist innovations, even of vital interest. He has an example, in the fact that the outraged Europeans still failed to adopt the Turkish bow. In fact he has a trail on it. A touch brings up the code book. Tapping a few keys projects the head of the trail. A lever runs through it at will, stopping at interesting items, going off on side excursions. It is an interesting trail, pertinent to the discussion. So he sets a reproducer in action, photographs the whole trail out, and passes it to his friend for insertion in his own Memex, there to be linked into the more general trail.”


These information trails are often seen as an early example of hypertext, an idea later charismatically developed by Ted Nelson. Bush’s trails are designed to be marked by a Memex operator, who can build personal narratives and idea mazes that are stored, consulted later, and possibly shared. Rather than facilitating easy, passive consumption of information, the Memex is a creative information management tool, encouraging active engagement with data and the creation of knowledge. In this landmark paper, Bush introduced ideas that would heavily influence a generation of research into information systems and human-computer interaction. One person influenced by Bush was JCR Licklider, who, as director of ARPA’s Information Processing Techniques Office (IPTO) from 1962 to 1964, developed ideas that led to the funding and development of three pivotal advances in information technology: the creation of computer science departments at several major universities, time-sharing, and networking.





A prescient vision of computer intelligence and human-computer symbiosis

In 1960, JCR Licklider published Man-Computer Symbiosis, which opens with a discussion of the relationship between fig trees and Blastophaga grossorum, the insects whose larvae live in the fig tree’s ovary and which pollinate the tree in the process. Rather than imagining computers as tools to be used by humans, Licklider proposes a symbiotic relationship between people and machines, two very different kinds of intelligence living together in tight harmony. The resulting partnership, he writes, “will think as no human brain has ever thought and process data in a way not approached by the information handling machines we know today.” Licklider looks even further forward, to a “distant future” in which “cerebration is dominated by machines alone,” but reassures the reader that “there will nevertheless be a fairly long interim during which the main intellectual advances will be made by men and computers working in intimate association.”


By understanding human and computer intelligences to be of two fundamentally different types whose mutual interaction will lead to a greater, ever-closer whole, Licklider anticipates the philosophies developed by contemporary scholars including UCSD’s Benjamin Bratton. Bratton contends that the Turing test is an intolerant measure of intelligence, and that by forcing artificial intelligence to pass as human we commit the same grave error as the British did when they forced Alan Turing, a gay man, to pass as straight.


In the following pages of Man-Computer Symbiosis, Licklider outlines the challenges that will have to be overcome to realize symbiosis. These include speed asymmetry between men and computers, memory hardware requirements, memory organization requirements, the language problem, and input and output equipment. Building on this work in 1962, he wrote a series of memos outlining what he called the Intergalactic Computer Network, described as “an electronic commons open to all, ‘the main and essential medium of informational interaction for governments, institutions, corporations, and individuals.’” This vision led to ARPANET, the direct predecessor of the internet.


Robert Taylor, founder of Xerox PARC's Computer Science Laboratory and Digital Equipment Corporation's Systems Research Center, noted that “most of the significant advances in computer technology—including the work that my group did at Xerox PARC—were simply extrapolations of Lick’s vision. They were not really new visions of their own. So he was really the father of it all.”



Augmenting human intellect by inventing the mouse




In the fall of 1945, a young radar technician named Douglas Engelbart was sitting in a hut at the edge of the jungle on Leyte, one of the Philippine islands, reading a copy of the Atlantic the local Red Cross chapter had let him borrow. He would later write in a letter to Vannevar Bush that the ideas he encountered in that article had influenced him “quite basically.” Upon returning home, a newly engaged Engelbart had something of a quarter-life crisis, realizing it was about time to figure out what he was going to do professionally. He decided that his primary goal would be to maximize how much good he could do for humanity, a pursuit he would characterize as a “crusade.” As he set about trying to figure out what crusade to get on, he says the answer came to him in a flash: “The complexity of a lot of problems and the means for solving them are just getting to be too much. The time available for solving a lot of the problems is getting shorter and shorter… The complexity/urgency factor had transcended what humans can cope with. I suddenly flashed that if you could do something basic to improve human capability to cope with that, then you’d really contribute something basic.”


With this in mind, Engelbart embarked upon his crusade. He went back to graduate school at Berkeley, where he tried to share with the computer scientists his interest in symbolic logic and the possibility of using computers to structure information (instead of just doing numeric computations). But, he recalls, most of them were just not interested. While his PhD would be in electrical engineering, Engelbart took courses in the logic and philosophy departments, and recalled that he “didn’t particularly travel within circles of engineering students,” preferring instead to hang out with the humanities majors, even though they didn’t really understand what he was trying to do either.


Engelbart, an engineer himself, felt that “more engineering was not the dominant need of the world at that time.” As a result, he was always a bit of an outsider. But if what he was really trying to do wasn’t computer engineering, what was it? Using information technology to augment human intellect is a worthy goal, but it is certainly not part of the discipline of computer science and is hardly a discipline itself. In a 1961 letter to Dr. Morris Rubinoff of the National Joint Computer Committee, he gives a hint: “The impact of computer technology is going to be more spectacular and socially significant than any of us can comprehend. I feel that comprehension can only be attained by considering the entire socio-economic power structure, a task which the people in the know about computer technology aren’t equipped for, and a job about which the people who might be equipped properly are not yet stimulated or alerted… In an instance where something looms on the horizon as imposingly as does computer technology, we should be organizing scouting parties composed of nimble representatives from different tribes - e.g. sociology, anthropology, psychology, history, linguistics, philosophy, engineering - and we shall have to adapt to continual change.” Engelbart seems to be saying that there’s a job to be done that involves designing and prototyping the character of human-computer interaction to ensure that its effects on society are desirable.


At the Stanford Research Institute, Engelbart set up a lab called the Augmentation Research Center (ARC) where he set about doing this job. First, he built a conceptual framework to “orient us toward the real possibilities and problems associated with using modern technology.” He did this by examining how people currently went about managing what he called the complexity/urgency factor, assuming that carefully observing the problems people have - what designers now call pain points - would suggest ways technology might help them solve those problems more effectively. This revealed the areas where research would be possible.


In a 1962 paper titled Augmenting Human Intellect, Engelbart outlined the results of his preliminary research, starting with the observation that the human ability to manipulate the world depends on four basic augmentation means: artifacts, language, methodology, and training. It is through what he refers to as the H-LAM/T system (Human using Language, Artifacts, and Methodology, in which he is Trained) that people manipulate concepts and symbols, and where the opportunity to augment human intellect lies. The paper also outlined a research method called ‘bootstrapping,’ in which the ARC team would use the equipment they were building, testing it in the process to make sure it worked. A target demographic was identified - knowledge workers - with a specific and deeply considered emphasis on computer programmers as the initial users of the system. Numerous scenarios were written imagining how a particular person in a specific situation might use the system. What is the discipline Engelbart was searching for? A combination of the arts and sciences, or an approach to computer science that uses the humanities as a starting point to move into prototyping technologies that help people work effectively with computers to make the world a better place? A close reading of Augmenting Human Intellect reveals SRI’s ARC as the first Interaction Design lab.


This paper would serve as a framework for Engelbart and his team of about 50 people to design and build a system for augmenting human intellect. The system, called NLS (or oN-Line System) was revolutionary. Its 1968 demonstration at Brooks Hall underneath San Francisco’s Civic Center is known as the Mother of All Demos, for good reason. Here, Engelbart introduced for the first time almost all the fundamental elements of modern personal computing: windows, hypertext, graphics, efficient navigation and command input, video conferencing, the computer mouse, word processing, dynamic file linking, revision control, and a real-time editor for digital collaboration across distances. While NLS laid the groundwork for today’s GUI interfaces, it differed in many ways from the systems we are now accustomed to. In addition to the mouse and keyboard, the NLS interface also used a chorded keyset that worked in conjunction with the other input elements to provide a powerful way to, in Engelbart’s words, “fly through the interface.” 


NLS built upon Bush’s sophisticated understanding of hypertext to create a highly customizable and collaborative linking, annotation, and editing process that was designed to augment and shape the cognitive processes of the people using the system, organizing enormous datasets into a coherent, understandable whole. Engelbart writes, “symbols with which the human represents the concepts he is manipulating can be arranged before his eyes, moved, stored, recalled, operated upon according to extremely complex rules - all in very rapid response to a minimum amount of information supplied by the human, by means of special cooperative technological devices.” This was referred to as “coevolution” - a process by which humans and computers together evolved their ability to work with information efficiently and effectively.



From a bicycle to a tricycle: computers for people who've never seen one before






NLS was designed as a powerful information working tool to augment human intellect. Like learning to drive a car or operate other kinds of sophisticated and powerful equipment, it took ten or fifteen hours to learn to use. But in the late sixties and early seventies, it was not at all clear that such powerful tools were necessary for the basic use cases that existed at the time. Some people at Engelbart’s lab and elsewhere began to feel that the complexity of Engelbart’s interface was actually a barrier to entry for people interested in computing, but there was no comparable system against which to test this hypothesis. Enter Xerox PARC - the second major Interaction Design lab.


In the words of Alan Kay, a key figure at PARC, “Engelbart, for better or worse, was trying to build a violin. Most people don’t want to learn the violin.” Kay and others at PARC, including its director Robert Taylor, also approached computing technology from a place of wonder at the possibilities of working with information. They were serious thinkers. Kay’s PhD thesis involved prototyping how a GUI might enable people to navigate through what he called ‘Ideaspace’ and incorporated inspiration from WH Auden, JS Bach, and Kahlil Gibran - “You would touch with your fingers the naked body of your dreams.” But the researchers at PARC set out to build a GUI that was simple and easy to use, rejecting Engelbart’s emphasis on coevolutionary learning. They also rejected Engelbart’s network vision, in which multiple screens were hooked into a single server on which people worked collaboratively - essentially what we would now call networked or cloud computing - in favor of a very new idea: the personal computer.


In 1973, PARC introduced the Alto - the first personal computer designed to support a GUI operating system. The Alto also pioneered what we now know as the WIMP interface: Windows, Icons, Menus, and Pointer. It turned out to be phenomenally intuitive and understandable. Steve Jobs, co-founder of Apple Inc., visited PARC in December 1979 and was astonished to see that Xerox hadn’t brought the Alto to market. So he poached some of the researchers at PARC, notably Larry Tesler, and brought its ideas to market himself. Apple’s Lisa, released in January 1983, was the first mass-produced personal computer with a WIMP GUI, modeled closely on the Alto. It was far too expensive and a commercial failure. In 1984, it was followed by the Macintosh.


In the years since the introduction of the personal computer, the Alto-derived WIMP interface has provided the foundation upon which computing has grown into a megastructure of planetary-scale computation. Information processing hardware has become cheap, highly miniaturized, and fast. The network has become vast. New software programs designed to leverage that sophistication in innovative ways continue to be invented. But progress towards new interface contexts for people to interact with those advanced programs, and to facilitate the programs’ interaction with one another, has been very limited. Enai is a determined effort to change this.

© 2024 Enai Corporation. All Rights Reserved.
