Automatic Pinion Cutter, Used by the Waltham Watch Company, circa 1892 / THF110250
The roles women play in manufacturing are occasionally highlighted, but more often hidden—opposing conditions that these two stories from our collections demonstrate.
The Waltham Watch Company in Massachusetts was a world-famous example of a highly mechanized manufacturer of quality consumer goods. Specialized labor, new machines, and interchangeable parts combined to produce the company's low-cost, high-grade watches. Waltham mechanics first invented machines to cut pinions (small gears used in watch movements) in the 1860s; the improved version above, on exhibit in Made in America in Henry Ford Museum of American Innovation, was developed in the 1890s.
This article, “The American Watch Works,” from the July-December 1884 issue of Scientific American, discussed the women workers of the Waltham Watch Company. / THF286663
In the late 19th century, reports on the world-renowned company featured women workers. An 1884 Scientific American article specifically called out women’s work, explaining that “For certain kinds of work female operatives are preferred, on account of their greater delicacy and rapidity of manipulation.” Recognizing that gendered experiences—activities that required manual dexterity, such as sewing, or the exacting work of textile production—had prepared women for a range of delicate watchmaking operations, the Waltham company hired them to drill, punch, polish, and finish small watch parts, often using machines like the pinion cutter above. The company publicized equal pay and benefits for all its employees, but women workers were still segregated in many factory facilities and treated differently in the surrounding community.
The same reasoning that guided women’s work at Waltham in the 19th century led 20th-century manufacturers to call on women to produce an early form of computer memory called core memory. Workers skillfully strung tiny rings of magnetic material on a wire grid under the lens of a microscope to create planes of core memory, like the one shown above from the Burroughs Corporation. (You can learn more about core memory weaving here, and more about the Burroughs Corporation here.) These woven planes would be stacked together in a grid structure to form the main memory of a computer.
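For readers curious how those woven planes actually stored information, here is a minimal sketch in Python—purely illustrative, not based on any specific Burroughs design—that models a core plane as a grid of one-bit magnetic cores, each selected by the crossing of its X and Y wires (the "coincident-current" scheme used in woven core memory):

```python
# Illustrative model of a magnetic core memory plane. Each core sits at
# an X/Y wire crossing and stores one bit of magnetic polarity. A core
# changes state only when BOTH its X and Y lines are energized at once.

class CorePlane:
    def __init__(self, rows, cols):
        self.rows, self.cols = rows, cols
        # Each core holds one bit (0 or 1) of magnetic polarity.
        self.cores = [[0] * cols for _ in range(rows)]

    def write(self, x, y, bit):
        # Only the core where the two half-currents coincide flips;
        # every other core on those lines sees too little current.
        self.cores[x][y] = bit

    def read(self, x, y):
        # Real core memory reads destructively: sensing a core resets
        # it to 0, so the controller must rewrite the value afterward.
        bit = self.cores[x][y]
        self.cores[x][y] = 0      # destructive read
        self.write(x, y, bit)     # rewrite cycle restores the value
        return bit

plane = CorePlane(4, 4)
plane.write(2, 3, 1)
print(plane.read(2, 3))  # -> 1
```

A full memory stacked many such planes, one per bit of the word, so addressing a single X/Y pair read out an entire word at once.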
However, unlike the women of Waltham, the stories of most core memory weavers—and other women like them in the manufacturing world—are still waiting to be told.
This post was adapted from a stop on our forthcoming “Hidden Stories of Manufacturing” tour of Henry Ford Museum of American Innovation in the THF Connect app, written by Saige Jedele, Associate Curator, Digital Content, at The Henry Ford. To learn more about or download the THF Connect app, click here.
On August 12, 1981, as members of the press gathered in the Waldorf-Astoria ballroom in New York City, one of the largest technology companies in the world was about to make an announcement. At the time, the name “IBM” was mostly associated with the room-sized installations of mainframe computers that the company had become famous for in the 1950s. They cost millions of dollars to purchase, needed their own air-conditioned rooms, and required specially trained staff. They were found in large corporations, universities, and research facilities—but not in a typical home. That was about to change with the introduction of the IBM Model 5150, also known as the IBM PC.
The idea of internally producing a small, affordable computer was at odds with IBM’s corporate culture. One naysayer remarked that “IBM bringing out a personal computer would be like teaching an elephant to tap dance.” Nonetheless, a development team was formed, and the lofty goal of completing the project in one short year was established. “Project Chess” began its race toward the finish line. The team of twelve was led by Don Estridge and included Mark Dean, who designed the ISA bus (an interface allowing easy expansion of memory and peripherals) and the color graphics system.
Part of the success story of designing the 5150 in such a short span of time is an exception to a long-standing IBM company rule: the engineers were allowed to include technology made by outside companies, rather than building every aspect of the PC from the ground up themselves. This is why the IBM PC uses an Intel 8088 microprocessor, can run Microsoft DOS, and is compatible with software made by other companies. It was also released under an open architecture model—a philosophy that would soon lead to a flood of PC-inspired “clones.”
An Atari 800 computer: an early attempt by a video game company to harness the home computing market. / THF155976
In truth, the IBM PC was not the first small home computer, and by entering this market, the company would face competition from Commodore, Atari, Tandy, and Apple—all of which had produced successful microcomputers beginning in the mid-1970s. To match the wide reach of these rivals, IBM sold its machines at convenient retailers like Sears and ComputerLand. Importantly, it was affordable by 1981 standards, with an introductory price of $1,565. And… it fit on your desktop.
One positive effect of IBM creating a PC was that it helped legitimize the notion of home computers beyond specialists and the home hobbyist crowd. IBM was essentially a well-recognized “heritage brand” by 1981, so the type of consumers reluctant to invest in a computer produced by a scrappy start-up were suddenly scrambling to put deposits down for a 5150. Whereas “young” computing companies (many of which started out as video game companies) were under threat of being swallowed up in a competitive market, IBM projected an aura of measured reliability and was trusted to stick around.
Ironically, while IBM’s plan was to break out of the office and into the home, PCs were purchased in bulk by businesses to populate desks and cubicles. A visual unity was established in office environments—fields of putty-gray and beige personal computers.
The IBM 5150 arrived at an important “boom” moment in computing history. It is evidence of an established company challenging its own design modes by harnessing emerging technologies. IBM’s decision to pivot also proved timely, since affordable microprocessors soon began to render behemoth, expensive mainframes largely obsolete. But most importantly, the IBM PC—and the wave of computers like it that followed—was designed with the non-specialist in mind, helping to make the personal computer an everyday device in people’s homes.
The growth of commercial aviation in the United States presented a challenge—how could airports control aircraft within the increasingly crowded space around them? The earliest efforts at air traffic control were limited to ground crew personnel waving flags or flares to direct planes through takeoffs and landings. Needless to say, this system needed improvement.
The first air traffic control tower opened in 1930 at Cleveland Municipal Airport. Pilots radioed their positions to the tower, where controllers noted the information on a map showing the positions of all planes within the airport's vicinity. Controllers radioed the pilots if a collision seemed possible and gave them permission to land or take off. Soon, all large American airports employed towers operated by the airports' respective municipal governments and staffed by growing crews. Smaller airports, though, remained dependent on a single controller (who might also handle everything from the telephone switchboard to passenger luggage). Additionally, some pilots treated controllers' instructions as mere suggestions—the pilots would land when and where they pleased.
Before air traffic controllers began communicating with pilots by radio, airports relied on ground crew personnel to direct planes through takeoffs and landings. / detail of THF94919
Airlines recognized the need for formal oversight and attempted to supply it themselves. They formed Air Traffic Control, Inc., in 1936 to regulate traffic at larger airports. This new agency worked well but applied only to commercial aircraft. It became clear that only federal supervision could regulate all commercial and private air traffic at the nation's airports. The Civil Aeronautics Act, passed by Congress in 1938, established the Civil Aeronautics Authority—the forerunner of today's Federal Aviation Administration (FAA)—to establish safety guidelines, investigate accidents, regulate airline economics, and control air traffic.
The post-World War II economic boom brought a surge in air travel, as well as larger and faster jet aircraft. But the nation's air traffic control system remained unchanged. Upgrades came only after a tragic mid-air collision between two passenger planes over the Grand Canyon in 1956. All 128 passengers and crew aboard both flights perished. Public outrage forced the widespread implementation of radar, a technology greatly improved during the war, into the management of U.S. skies.
Into the 1960s, air traffic controllers augmented radar signal displays with hand-written plastic markers that identified each plane and its altitude. Integrating computers with radar eliminated the need for written markers, as information about each plane automatically displayed on radar screens. This improved radar system, referred to as the Automated Radar Terminal System, finally made its way to metropolitan airports in 1969, when the FAA contracted with Sperry Rand to build control computers and radar scopes.
This computer-integrated radar scope, used at Detroit Metro Airport from 1970 to 2001, was one of the first units capable of displaying an airplane's identification number and altitude directly on the screen. In this photograph, panels have been removed to reveal the unit’s internal components. / THF154729
This radar scope display panel was the first of those scopes produced. It was installed at Detroit Metropolitan Airport in 1970, where it sat with others like it in the tower's radar room and was used to monitor and control aircraft within 35 miles of the airport. Two people worked the unit in tandem, sitting on either side of the display screen. While this arrangement made maximum use of expensive equipment, it led to inevitable difficulties—users sometimes disagreed on screen contrast settings. With the introduction of single-user LCD displays in the 1980s and 1990s, this unit was downgraded to training use and then retired from service in 2001.
Today, radar itself is facing retirement from air traffic control. Aircraft can relay their positions to each other and the ground without radar through Automatic Dependent Surveillance-Broadcast, which combines GPS technology with high-speed data transfer. Required in most controlled airspace as of January 1, 2020, this new system provides more accurate location information. It also allows closer spacing of aircraft in the skies, increasing capacity and permitting better traffic management.
Though it was outpaced by newer technologies, this computer-integrated radar scope—the first of its kind—survives in the collections of The Henry Ford as evidence of the critical developments that produced the safe and efficient aviation system we rely on today. To discover more aviation stories, visit the Heroes of the Sky exhibition in Henry Ford Museum of American Innovation, or find more on our blog.
Matt Anderson is Curator of Transportation at The Henry Ford.
We all know that 2020 was quite the year—there was a worldwide pandemic, protests across the United States, and a contentious presidential election. It’s understandable that during the year, we all had a lot on our minds.
That said, we shared more than 160 new posts on our blog during 2020. Most of these were eagerly found and devoured by our readers. But a few really great stories from our collections might have gotten lost in the shuffle—and we wanted to make sure you didn’t miss them. Here are ten of those hidden gems to help you start off 2021 right.
The Jazz Bowl: Emblem of a City, Icon of an Age. Discover how a 24-year-old ceramic artist, Viktor Schreckengost, designed a bowl that both captured the essence of New York City in the early 20th century and became an icon of America’s “Jazz Age.”
A LINC console built by Jerry Cox at the Central Institute for the Deaf, 1964.
New Acquisition: LINC Computer Console. The LINC computer may not be as familiar to you as the Apple 1, but it is in contention for the much-debated title of “the first personal computer.” Learn more about its history and the people involved in its creation.
Immerse Yourself in Pop Culture
Lady and the Tramp Charm Bracelet, circa 1955 / THF8604
Lady and the Tramp Celebrates 65 Years. Take a new look at an old classic—Disney’s 1955 movie Lady and the Tramp. Learn how it came to be and share in some personal memories from one of our curators.
Crosley Reado Radio Printer, 1938-1940 / THF160315
Experiments with Radio Facsimile at W8XWJ. Learn about the “Press-Radio War” of the 1930s, and a revolutionary, but ultimately short-lived, experiment by Detroit News radio station W8XWJ to deliver print-at-home news.
A More Colorful World. Discover how a chemistry student, seeking to create a synthetic cure for malaria, inadvertently created the first synthetic dye, aniline purple—and then created more, transforming the world’s access to color.
Ellice Engdahl is Digital Collections & Content Manager at The Henry Ford.
The auditorium at the 1968 Fall Joint Computer Conference before guests arrive. / THF610598
The setting is sparse. The downward sweep of theatre curtains, a man seated stage left, backed by a hinged office cubicle wall. Technology in this image is scarce, and yet it defines the moment. A video camera is perched on top of the wall, its electronic eye turned downwards to surveil a man named Douglas Engelbart, seated in a modified Herman Miller Eames Shell Chair below. A large projection screen shows a molded tray table holding a keyboard at its center, a chunky-looking computer mouse made of wood on the right side, and a “chording keyboard” on the left. Today, we take the computer mouse for granted, but in this moment, it was a prototype for the future.
The empty auditorium chairs in this image will soon be filled with attendees of a computer conference. It is easy to imagine the collective groan of theater seating as this soon-to-arrive audience leans a little closer, to understand a little better. With the click of a shutter from the back of the room, this moment was collapsed down into the camera lens of a young Herman Miller designer named Jack Kelley. He knew this moment was worth documenting because if the computer mouse under Douglas Engelbart’s right hand onstage was soon going to create “the click that was heard around the world,” this scene was the rehearsal for that moment.
Entrance to the 1968 Fall Joint Computer Conference, San Francisco Civic Auditorium. / THF610636
“The Mother of All Demos”
On December 9, 1968, Douglas Engelbart of the Stanford Research Institute (SRI) hosted a session at the Joint Computer Conference at the Civic Center Auditorium in San Francisco. The system presented—known as the oNLine System (or NLS)—was focused on user-friendly interaction and digital collaboration.
Douglas Engelbart demonstrates the oNLine System. / THF146594
In a span of 90 minutes, Engelbart (wearing a headset like the radar technician he once was) used the first mouse to sweep through a demonstration that became the blueprint for modern computing. For the first time, computing processes we take for granted today were presented as an integrated system: easy navigation using a mouse, “WYSIWYG” word processing, resizable windows, linkable hypertext, graphics, collaborative software, videoconferencing, and presentation software similar to PowerPoint. Over time, the event gained the honorific “The Mother of All Demos.” When Engelbart finished his demonstration, the audience gave him a standing ovation.
Fixing the Human-Hardware Gap
Engelbart joined SRI in 1957 and established the Augmentation Research Center (ARC) there to study the relationship between humans and machines. It was here, in 1963, that work on the first computer mouse began. The mouse was conceptualized by Engelbart and realized from an engineering standpoint by Bill English. All the while, work on NLS was percolating in the background.
Douglas Engelbart kicks back with the NLS at the Stanford Research Institute (SRI). / THF610612
While Engelbart was gearing up to present the NLS, Herman Miller Research Corporation’s (HMRC’s) president and lead designer, Robert Propst, was updating the “Action Office” furniture system. Designed to optimize human performance and workplace collaboration, Action Office caught Engelbart’s attention. He was excited by its flexibility and decided to consult with Herman Miller to provide the ideal environment for people using the NLS. Propst sent a young HMRC designer named Jack Kelley to California so he could study the needs of the SRI group in person.
Jack Kelley and Douglas Engelbart testing Herman Miller’s custom Action Office setup at Stanford Research Institute. / THF610616
After observing and responding to the needs of the team, Kelley recommended a range of customized Action Office items, which appeared onstage with Engelbart at the Joint Computer Conference. One of the items that Kelley designed was the console chair from which Engelbart gave his lecture. He ingeniously paired an off-the-shelf Shell Chair designed by Charles and Ray Eames with a molded tray attachment to support the mouse and keyboard. This one-of-a-kind chair featured prominently in The Mother of All Demos.
An unobstructed view of Jack Kelley’s customization of an Eames Shell Chair with removable, swinging tray for the NLS. The chording keyboard is visible at left, and the prototype mouse is at right. / THF610615
During the consultation, Kelley also noticed that Engelbart’s mouse prototype had difficulty tracking on hard surfaces. He created a “friendly” surface solution by simply lining the right side of the console tray with a piece of Naugahyde. If Engelbart was seen controlling the world’s first mouse onstage in 1968, Kelley contributed one very hidden “first” to the story of computing, too: the world’s first mousepad. Sadly, the one-of-a-kind chair disappeared over time, but luckily, we have many images documenting its design within The Henry Ford’s archival collections.
A closer view of the world’s first mousepad – the beige square of Naugahyde inset into the NLS tray at bottom right. / THF610645
The computer scientist Mark Weiser said, “the most profound technologies are the ones that disappear. They weave themselves into the fabric of everyday life until they are indistinguishable from it.” If this is true, the impact of Engelbart’s 1968 demonstration—supported by Kelley’s console chair and mousepad—lives on in hidden pieces of computing history. Just as design shaped the computer, the computer also shaped design.
Kristen Gallerneaux is Curator of Communications & Information Technology at The Henry Ford.
Throughout its history, the Burroughs Corporation adhered to the founding principles of William S. Burroughs – to respond to the human problems of the times with relevant technologies. As part of the William Davidson Foundation Initiative for Entrepreneurship, we had the opportunity to delve into the Burroughs Corporation Collection, which consists of machinery, photographs, publications, and marketing materials for the business equipment that Burroughs manufactured.
William Seward Burroughs – grandfather to the Beat Generation author of the same name – was a banker from Auburn, New York. He was also an inventor with an aptitude for mechanical design. Burroughs suffered from tuberculosis and moved his family to St. Louis, Missouri, in 1882 at the suggestion of his doctor, who thought the warmer climate would be better for his health. While there, Burroughs rented bench space from a local machine-shop owner, Joseph Boyer, and began designing a machine that could ease the work of figuring and re-figuring mathematical calculations by hand – work that proved tedious for bankers and shopkeepers alike. In 1886, with a working machine complete, Burroughs formed the American Arithmometer Company with co-founders Thomas Metcalfe, R.M. Scruggs, and William R. Pye to produce and market his machine.
The company’s first device was a simple addition and subtraction machine. Unfortunately, the machines didn’t work as well as planned. It was quickly discovered that accurate calculations required a specific amount of pressure to be applied to the handle. This was an unforeseen mechanical flaw that produced inaccurate calculations and caused bankers to lose faith in the machine, nearly causing the fledgling company’s failure. Burroughs was incredibly disappointed. In fact, he was in the process of quite literally throwing the machines out the window of his second-story workroom when he had the idea to use a dash-pot. A dash-pot is a mechanical device which resists motion – for instance, preventing heavy doors from slamming. This provided a uniform motion for the handle regardless of the force exerted upon it, regulating the mechanism.
With the handle problem solved, bankers renewed their trust in the machines and bought them with enthusiasm. In the first decade, the company grew in staff and sales, increasing its product line to four models by 1898. Unfortunately, William S. Burroughs died the same year, but his company was left in good hands. Under President Joseph Boyer, the company experienced significant growth. By 1904, the company had outgrown its St. Louis facility, moving operations to Detroit, Michigan, where a 70,000-square-foot factory was built. In 1905, the company was renamed the Burroughs Adding Machine Company as a tribute to its late founder.
In the 1920s, the company continued to expand its operations, establishing worldwide sales in 60 countries and production in South America, Europe, Africa, and Australia. In the mid-1930s, recognizing the potential for additional advanced equipment, the company’s product line diversified to include over 450 models of manual and electric calculation devices, bookkeeping machines, and typewriters.
During World War II, Burroughs’ production was halted as the company collaborated with the National Defense Program to enter into military and war contracts. Its most influential contribution to the war effort was the development of the Norden bombsight in 1942. According to the Burroughs’ “History” booklet, this apparatus made “accurate, high-altitude bombing possible, and was considered by some military authorities as the single most significant device in shortening the war.” This same bombsight was used on the Enola Gay to accurately drop the atomic bomb “Little Boy” on Hiroshima, Japan, in 1945.
Burroughs’ work throughout the war launched the company onto a different trajectory once military production was no longer required. Wartime needs had accelerated computer and electronics research, which became a significant part of the company’s focus in the 1950s, along with defense, space research, banking, and business technology. In 1952, Burroughs built the core memory system for the ENIAC – the world’s first electronic general-purpose computer.
The 1950s were a time for diversification for Burroughs as the company acquired many other entities in order to expand its product capabilities. In 1953, to reflect its increasingly diverse product and service offerings, the company was renamed the Burroughs Corporation, and was recognized as a single outlet for a variety of business management products. One of the most significant acquisitions came in 1956, when Burroughs acquired ElectroData Corporation of Pasadena, California. This allowed Burroughs to further expand into the electronic computing market and led to the development of the B5000 series in 1961, which was celebrated as a groundbreaking scientific and business computer.
Successful collaboration during wartime prompted Burroughs Corporation to be awarded additional government and defense contracts throughout the 1960s. The company provided electronic computing solutions in the Navy’s POLARIS program, the Air Force’s SAGE, ALRI, ATLAS, and BUIC air defense networks, and the NORAD combat computing and data display system. According to the Burroughs’ “History” booklet, during the Cold War Burroughs computers were being “used to make split-second evaluations of threats to the North American continent using input from satellites and radar throughout the world.”
Burroughs also produced a transistorized guidance computer in 1957, which was used in the launch of Atlas intercontinental ballistic missiles (ICBMs) – this same system was deployed in the 1960s to launch Mercury and Gemini space flights.
By the 1970s, Burroughs had emerged as a major player in the computer industry, but was still in the shadow of powerhouses like IBM. To further its influence and market potential, the company began thinking about office automation and information management in a holistic way, providing all scales of computers from mini- and micro-computers to networks and large modular systems – along with the software and peripherals (printers, communications systems, displays, and keyboards) to complement them.
Throughout the early 1980s, the company made additional acquisitions to fill technology voids and strengthen areas targeted for future growth. It also developed joint ventures to strengthen business relationships. Despite this growth, IBM continued to dominate the market as the unrivaled leader of the computer industry. Hoping to challenge IBM, Burroughs embarked on a substantial entrepreneurial undertaking with Sperry Corporation in 1986. Combining the market positions, talent, and resources of both corporations, the merger was meant to signal a new era of competition. The result was one of the largest mergers ever to occur in the computer industry and the creation of a new entity in information technology: Unisys.
From the adding machine to office equipment to computers that helped to send people into space, the Burroughs Corporation was steadfast in its pursuit of the latest research and in its development of cutting-edge technology. To view additional items we’ve already digitized from our Burroughs Corporation Collection, check out our Digital Collections page!
Samantha Johnson is Project Curator for the William Davidson Foundation Initiative for Entrepreneurship at The Henry Ford. Special thanks to Kristen Gallerneaux, Curator of Communications & Information Technology, for sharing her knowledge and resources to assist in the writing of this post.
A LINC console built by Jerry Cox at the Central Institute for the Deaf, 1964.
There are many opinions about which device should be awarded the title of "the first personal computer." Contenders range from the well-known to the relatively obscure: the Kenbak-1 (1971), Micral N (1973), Xerox Alto (1973), Altair 8800 (1974), Apple 1 (1976), and a few other rarities that failed to reach market saturation. The "Laboratory INstrument Computer" (aka the LINC) is also counted among this group of "firsts." Two original examples of the main console for the LINC are now part of The Henry Ford's collection of computing history.
The LINC is an early transistorized computer designed for use in medical and scientific laboratories, created in the early 1960s at the MIT Lincoln Laboratory by Wesley A. Clark with Charles Molnar. It was one of the first machines that made it possible for individual researchers to sit in front of a computer in their own lab with a keyboard and screen in front of them. Researchers could directly program and receive instant visual feedback without the need to deal with punch cards or massive timeshare systems.
These features of the LINC certainly make a case for its illustrious position in the annals of personal computing history. For a computer to be considered "personal," the device had to have a keyboard, monitor, data storage, and ports for peripherals. It had to be a stand-alone device, and above all, it had to be intended for use by individuals rather than shared like the large "timeshare" systems often found in universities and large corporations.
The inside of a LINC console, showing a network of hand-wired and assembled components.
Prototyping

In 1961, Clark disappeared from the Lincoln Lab for three weeks and returned with a LINC prototype to show his managers. His ideal vision for the machine was centered on user friendliness. Clark wanted his machine to cost less than $25,000, which was the threshold a typical lab director could spend without needing higher approval. Unfortunately, Clark’s budget goal wasn’t reached—when commercially released in 1964, each full unit cost $43,000.
The first twelve LINCs were assembled in the summer of 1963 and placed in biomedical research labs across the country as part of a National Institutes of Health-sponsored evaluation program. The future owners of the machines travelled to MIT as part of this LINC Evaluation Program to take part in a one-month intensive training workshop, where they would learn to build and maintain the computer themselves.
Once home, the flagship group of scientists, biologists, and medical researchers used this new technology to do things like interpret real-time data from EEG tests, measure nervous system signals and blood flow in the brain, and collect data from acoustic tests. Experiments with early medical chatbots and medical analysis also happened on the LINC.
In 1964, a computer scientist named Jerry Cox arranged for the core LINC team to move from MIT to his newly formed Biomedical Computing Laboratory at Washington University in St. Louis. The two devices in The Henry Ford's recent acquisition were built in 1963 by Cox himself while he was working at the Central Institute for the Deaf. Cox was part of the original LINC Evaluation Board and received the "spare parts" leftover from the summer workshop directly from Wesley Clark.
Mary Allen Wilkes and her LINC "home computer." In addition to the main console, the LINC’s modular options included dual tape drives, an expanded register display, and an oscilloscope interface. Image courtesy of Rex B. Wilkes.
Mary Allen Wilkes

Mary Allen Wilkes made important contributions to the operating system for the LINC. After graduating from Wellesley College in 1960, Wilkes showed up at MIT to inquire about jobs and walked away with a position as a computer programmer. She translated her interest in “symbolic logic” philosophy into computer-based logic. Wilkes was assigned to the LINC project during its prototype phase and created the computer's Assembly Program. This allowed people to do things like create computer-aided medical analyses and design medical chatbots. In 1964, when the LINC project moved from MIT to Washington University in St. Louis, rather than relocate, Wilkes chose to finish her programming on a LINC that she took home to her parents’ living room in Baltimore. Technically, you could say Wilkes was one of the first people in the world to have a personal computer in her own home.
Wesley Clark (left) and Bob Arnzen (right) with the "TOWTMTEWP" computer, circa 1972.
Wesley Clark

Wesley Clark's contributions to the history of computing began much earlier, in 1952, when he launched his career at the MIT Lincoln Laboratory. There, he worked as part of the team behind Project Whirlwind—the first real-time digital computer, created as a flight simulator for the US Navy. At the Lincoln Lab, he also helped create the first fully transistorized computer, the TX-0, and was chief architect for the TX-2.
Throughout his career, Clark demonstrated an interest in helping to advance the interface capabilities between human and machine, while also dabbling in early artificial intelligence. In 2017, The Henry Ford acquired another one of Clark's inventions called "The Only Working Turing Machine There Ever Was Probably" (aka the "TOWTMTEWP")—a delightfully quirky machine that was meant to demonstrate basic computing theory for Clark's students.
Whether it was the “actual first” or not, it is undeniable that the LINC represents a philosophical shift as one of the world’s first “user friendly” interactive minicomputers with consolidated interfaces that took up a small footprint. Addressing the “first” argument, Clark once said: "What excited us was not just the idea of a personal computer. It was the promise of a new departure from what everyone else seemed to think computers were all about."
Kristen Gallerneaux is Curator of Communication & Information Technology at The Henry Ford.
It’s 1984. Turn on your Macintosh computer. Marvel at the convenience of the mouse under your hand. Point the arrow on your screen towards a desktop folder and click to open a file. Drag it and drop it somewhere else. Or, open some software. How about MacPaint? Select the pencil, draw some craggy lines; use the spilling paint bucket to fill in a shape. Move your arrow to the floppy disk to save your work. And then… imagine a worst-case scenario, as the ticking wristwatch times out. A pixelated cartoon bomb with a lit fuse appears. Your system crashes. The “sad Mac” appears.
Introducing the Icon
Computer icons are visual prompts that, when clicked, launch programs and files, trigger actions, or indicate a process in motion. Clicking an icon is a simple gesture that we take for granted. In our current screen-based culture—spread between computers and smartphones—we might absent-mindedly use these navigational shortcuts hundreds (if not thousands) of times a day.
Before the mid-1980s, after booting up their computers, people typically found themselves greeted by a command line prompt floating in a black void, waiting for direction. That blinking cursor could seem intimidating for new home computer users because it assumed you knew the answers—that you had memorized the machine’s coded language. The GUI (graphical user interface, pronounced “gooey”) changed how humans interacted with computers by creating a virtual space filled with clickable graphical icons. This user-centric form of interaction, known as “the desktop metaphor,” continues to dominate how we use computers today.
The 1984 Apple Macintosh was not the first computer to use a GUI environment or icons. That achievement belongs to the 1973 Xerox Alto—a tremendously expensive, vertically-screened system that only sold a few hundred units. After a few failed attempts, the multi-tasking GUI system finally found a foothold in the home computing market with the introduction of “the computer for the rest of us”—the Macintosh.
From Graph Paper to Screen Pixels
After completing her PhD in Art History, Susan Kare briefly entered the curatorial sphere before realizing that she would rather dedicate her career to the production of her own creative work. In 1982, Andy Hertzfeld, a friend of Kare's from high school, called with an interesting opportunity: join Apple Computer's software group and help design the user experience for the then-developing Macintosh computer.
Kare took up Hertzfeld’s offer and set to work designing the original Macintosh icons, among them the trash can, the file folder, the save disk, the printer, the cloverleaf command (even today, this symbol appears on Apple keyboards), and the mysterious “Clarus the Dogcow.”
Since no icon-design software existed yet, Kare created the first Macintosh icons and digital fonts through completely analog means. Using a graph paper notebook, she filled in the squares with pencil and felt-tipped pens, coloring inside the lines of the graph as an approximation of the Macintosh's screen. Despite the limited number of available pixels, Kare found economical ways to pack the maximum amount of visual or metaphoric meaning into a tiny grid of space—all without using shading or color.
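Kare's graph-paper method maps directly onto how a bitmap icon is stored: a grid of on/off pixels. As a rough illustration of the idea, the sketch below encodes a made-up glyph (not one of Kare's actual designs) as rows of filled and empty squares and renders it to the terminal:

```python
# A toy sketch of pixel-grid icon design: "X" marks a filled square,
# "." an empty one—much like pencil marks on graph paper. The glyph
# below is an invented example, not an original Macintosh icon.
ICON = [
    "..XXXX..",
    ".X....X.",
    "X.X..X.X",
    "X......X",
    "X.X..X.X",
    "X..XX..X",
    ".X....X.",
    "..XXXX..",
]

def render(icon):
    """Translate the 0/1 grid into block characters for display."""
    return "\n".join(row.replace("X", "█").replace(".", " ") for row in icon)

print(render(ICON))
```

With only an 8×8 grid and no shading or color, every filled square has to earn its place—the same constraint Kare worked under on the Macintosh's 32×32 icon grid.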
Next Wave
Kare’s icons and digital fonts exist beyond the lifespan of the Macintosh, appearing in later Apple products and even early iPods. Iterations and mutations of her icon designs continue to define the visual shorthand of our desktops and software today, migrating across systems and platforms: NeXT Computers, IBM and Windows PCs. Have you ever played Solitaire on a Windows 3.0 computer? If so, you’ve played with Kare’s digital deck of cards.
A physical version of Susan Kare’s Windows 3.0 Solitaire game.
Have you ever sent a “virtual gift” over Facebook like a disco ball, penguin, or kiss mark? Again, this is the work of Kare, whose work has been quietly shaping our interactions with technology since 1984—making computers seem more friendly, more human, more convenient—one click at a time.
Visitors to the IBM Pavilion at the 1964 New York World’s Fair were submerged in a futuristic world made possible by computers—a world imaginatively conjured by an intricately detailed fake newspaper bearing the headline “Computer Day at Midvale!”
The one-of-a-kind aluminum panel was created by the Eames Office, the studio of famed designers Charles and Ray Eames. Hand-painted with imagined newspaper headlines and draped with patriotic bunting, it hung on the back of one of the pavilion’s “Little Theatres” and was surrounded by lights, intended to lure visitors.
“The themes in the Midvale panel, and the IBM Pavilion on the whole, document a critical moment where people were being exposed to the culture of computing on a mass scale,” said Kristen Gallerneaux, The Henry Ford’s curator of communications and information technology. “Accessible systems like the IBM/360 were just around the corner, whose adoption would touch (and potentially disrupt) the lives of information and office workers. IBM needed to address this wariness of technology — they needed to humanize computers. The company found their solution in the playful visual communication skills of the Eames Office.”
Last year, The Henry Ford acquired the aluminum panel from its original owners, whose father, Robert Charles Siemion, had worked as an engineer and manager at the 1964 IBM Pavilion.
“The ephemeral nature of those fairs was such that most of the displays — and even the architecture — would be dismantled after the fair was over,” said Gallerneaux, who learned about the panel in an article on antique pricing. “But Siemion, as a manager, was invited to take home part of the pavilion as a memento. We’re lucky that he chose to salvage this panel and that his children knew to hold onto it all these years.”
The Eames Office employees who designed the pavilion are listed on the newspaper’s left in a credits area. The panel is among several IBM Pavilion-related objects The Henry Ford has acquired and the third such artifact associated with Charles and Ray Eames.
“Charles and Ray Eames were fascinated with the circus and early Americana, and there’s a wonderful sense of these themes coming together with high technology in the panel,” Gallerneaux said. “The IBM Pavilion was designed to send you into another head space so you could synthesize the concepts coming together at the time. It was an interesting collision of computing history and design history happening in one place.”
From a conservation standpoint, the panel, well maintained by its owners, only required minimal treatments. “It’s interesting to think about the public as stewards of material culture,” Gallerneaux said. “We acquire a lot of interesting collection items that way.”
The “Computer Day at Midvale” panel will appear in a future exhibit at The Henry Ford about communications and information technology.
DID YOU KNOW? The 1964 New York World’s Fair featured 140 pavilions spread over 646 acres.
Almost exactly two years ago, The Henry Ford embarked on a project to identify, conserve, photograph, catalog, rehouse, and make available online at least 1,000 items from our communications collections. This project was made possible through a generous $150,000 Museums for America grant (MA-30-13-0568-13) from the Institute of Museum and Library Services, or IMLS. Though we will continue to work on some straggler artifacts that have not yet made it through the entire process, the grant officially ended on September 30, with a total of 1,261 artifacts available online.

One of the very last artifacts to be added during the official grant period was this computer trainer, used in the metro Detroit area in the 1960s to teach students to operate computers, a skill increasingly needed in the American workforce. You can see some of the other artifacts that worked their way through the IMLS grant process by browsing our digital collections for such communications-related artifacts as typewriters, radio receivers, phonographs, amplifiers, cameras, motion-picture cameras, mimeographs, and magic lanterns, among many others. We extend our thanks once again to IMLS for enabling us to make these significant collections accessible to everyone.
Ellice Engdahl is Digital Collections & Content Manager at The Henry Ford.