
Robots: A History: Welcome - The History of Robotics

Using the Collections:

Linda Hall Library has many books dedicated to robotics, artificial intelligence, self-operating machinery, and the history of the development of these ideas and sciences. 

The history of robotics and the search for non-biological life is centuries old, and a new chapter is being written every day by scientists and engineers conducting new research. This guide briefly covers the history of this pursuit, from the Greek mythos to Mary Shelley's Frankenstein, to the amazing robots being used today. You will also be able to virtually browse an example of the many books available at the Library that cover robotics and its sister subjects.

As always, if you want more information about robots, robotics, or the history of the science, ask your librarian for assistance locating more books!

The Quest for Artificial Life

Robotics has a complex history, running from the concept of manufactured life, through the development of the computer, electronic technology, and artificial intelligence, and finally to the combination of all three into decision-making, aware beings. To truly begin the history of robotics, however, we have to widen its starting point to include a time when machines, at least machines in the metal, fiberglass, and digital forms we know today, had not yet been conceived. The concept of non-biological life goes far beyond the modern idea of mechanical beings and artificial intelligence. Some of the earliest myths of the human race center on the creation of a being from a non-biological substance: clay, rock, brass, and the like.

One of the best and earliest examples comes from Greek mythology: Hephaestus. The god of fire, he was the divine smith said to have forged Zeus's lightning bolts. The tales also have him fashioning human-like beings out of gold and bronze. In The Iliad, the narrator tells of Hephaestus shaping moving, speaking, thinking girls out of pure gold so that he could have assistance in his forge. Also in his forge were twenty three-legged tables, or tripods, that moved without guidance. As for the being made from bronze, it was said that King Minos of Crete commissioned the god to create a guard for the island: the bronze giant Talos, who would circle the island tirelessly, keeping it safe from invasion.

Another Greek ruler, Pygmalion, King of Cyprus, was said to have created a statue in the figure of a woman so lovely that he fell in love with her. Unable to rid himself of his infatuation, the Greeks said, he prayed unceasingly to Aphrodite, the goddess of love, who eventually granted his wish and brought the statue to life, so that Pygmalion and his bride could live happily ever after.

The creation of human beings was also said to be a work of the gods, or Titans, to be exact. Prometheus, one of the Titans defeated by Zeus, was said to have fashioned man out of clay, then bestowed upon his creations the gifts of fire and technology. Unfortunately, this myth ends more sadly than the others. Because Prometheus neglected to consult Zeus before doing this, he was punished, not with death, as Titans are immortal, but with eternal suffering. Bound to a mountain, Prometheus was fated to have an eagle gnaw at his liver every day; every night, the liver grew back, only to be devoured again.

Of course, the concept of created life isn't limited to Greek mythology. In Jewish folklore, there is a being called a golem, from a Hebrew word meaning "shapeless mass." A golem is said to be a creature created by magic. The tales vary in exactly how a golem is made; in each telling, however, a golem is formed out of soil or clay, and it can only be brought to life if the secret name of God is uttered. Unlike a being that can think for itself, a golem exists only to do the will of its creator. Once its duty is finished, it either dissolves back into dust or is returned to its original form (soil, dust, or clay) by its creator once again uttering the secret name of God.

Also drawing on the idea of a golem is Mary Shelley's first novel, often heralded as the first true work of science fiction: Frankenstein; or, The Modern Prometheus (1818), which tells the story of the ill-fated Victor Frankenstein, who creates a "monster" from the parts of once-dead humans. Instead of drawing on the supernatural to create life, Shelley had her scientist draw on the scientific, although in the novel's subtitle, "The Modern Prometheus," she purposefully references Greek mythology's creator of man.

Early legends associated with the Catholic Church also told of mechanical men being constructed, although these tales took on a more cautionary tone. The English friar and alchemist Roger Bacon was said to have created a talking brass head. Unfortunately, despite a warning given by a sympathetic spirit, the head's words went unheard, and it crumbled into dust. Another alchemist, Albertus Magnus, was said to have created an entire metal man that could answer questions and solve problems. It, too, met an untimely end, when Saint Thomas Aquinas smashed the creation to pieces, calling its creator, Magnus, a sorcerer.

As mankind's knowledge of mechanics and the human body grew, the concept of artificial life grew more mechanical with it. Automatons were objects fashioned to move automatically (hence the name), mimicking everyday actions. These automatons could be made to look like human figures or animals, with a set series of movements programmed into them thanks to complex clockwork inner workings. Some of the more famous automatons are:

  • Jacques de Vaucanson's Duck: Fashioned from copper in 1738, the duck could quack, bathe, drink, eat, and "digest" its food and drink. While the original duck was destroyed when a fire ravaged two wings of the Krakow Museum in the 1800s, a copy currently sits in the Musée des Automates (Museum of Automatons) in Grenoble.

  • Pierre Jaquet-Droz's Automatic Scribe (1774): Fortunately, the scribe still exists and can be seen at the Musée d'Art et d'Histoire in Neuchâtel, Switzerland. The boyish scribe dips his quill in an inkwell and writes a letter. Unlike the duck, the writer looks much more lifelike, with blinking eyes and rosy cheeks. Jaquet-Droz was a watchmaker, and it shows in the immense complexity of the mechanical doll: there are over 6,000 moving parts, and the automaton can be programmed to write whatever the operator wishes, so long as it is limited to 40 letters of text.

  • Wolfgang von Kempelen's Automated Chess Player, "The Turk" (1770s): Kempelen created his chess-playing automaton for the Archduchess Maria Theresa of Austria. The machine, fashioned to look like a man clad in Turkish robes bent over a chessboard, seemed able to reason and make logical decisions, beating nearly every human competitor who sat down across from it to play a game of chess. The automaton could seemingly counteract unpredictable human behavior and notice a cheating opponent; if the cheating continued, it would sweep its arm across the board, scattering the pieces. The Turk toured Europe successfully for many years, impressing the likes of Benjamin Franklin and Napoleon. Years later, it was revealed that the Turk's decisions and movements were made by a human operator hidden inside the four-foot by two-and-a-half-foot by three-foot cabinet beneath the figure, working a series of switches and levers.

Three years after selling his duck, Vaucanson turned his mechanical talents to more useful and practical machinery. In 1745, he became the first to build a working automatic weaving loom. His control system eventually gave rise to punched cards and tapes, and while programming the machine was a long, complex process, once set up, the program could be easily changed. It is this kind of programmable control that makes a machine a robot.

About half a century later, in 1801, Joseph Marie Jacquard took Vaucanson's automated loom and improved on the idea, using a series of holes punched through stiff cards to control the pattern of the needles. This greatly reduced the time spent weaving and increased production output; over 10,000 copies of the loom were soon in use throughout France, and the design eventually spread to Great Britain after the end of the Napoleonic Wars.

You may be wondering what looms have to do with robotics, but it's all tightly interwoven. Jacquard's punched card and tape method eventually gave English mathematician George Boole the idea for "Boolean algebra," an analog of the rules of logic that treats punched and un-punched holes as a yes/no formation: "yes" (on) being the equivalent of 1, and "no" (off) being 0, in the binary system. Boole officially published his system in 1847, not realizing that it would become the basis for every digital computer in use today.
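To make the yes/no idea concrete, here is a minimal sketch in Python (the card layout is entirely made up for illustration) of how punched and un-punched positions behave as Boolean values and, read together, as a binary number:

```python
# A made-up punched-card row: 1 = punched ("yes"/on), 0 = un-punched ("no"/off).
card = [1, 0, 1, 1, 0]

# Boolean algebra reduces reasoning to operations on exactly these two values.
a, b = True, False
print(a and b)   # logical AND -> False
print(a or b)    # logical OR  -> True
print(not b)     # logical NOT -> True

# The same row read as a binary number: the encoding digital computers inherited.
value = int("".join(str(bit) for bit in card), 2)
print(value)     # 0b10110 == 22
```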

The first full step toward digital machinery came from Charles Babbage, who began work on a digital calculating machine in 1823 with funding from the British government. Unfortunately, Babbage had no electronics with which to implement his logic and was forced to use slower mechanical switches. He worked on two prototypes: a "Difference Engine" and, later, beginning in 1836, an "Analytical Engine." Neither saw completion, which was a shame, because the ideas behind both were completely sound. (Precursors to Babbage's machines can be seen in various earlier calculating devices, such as Blaise Pascal's machine, which could perform simple addition and subtraction, and Gottfried Wilhelm von Leibniz's calculator, which could also multiply. The mechanical cash registers used prior to World War II were based on these mathematical machines.)
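As an aside, the "Difference Engine" got its name from the method of finite differences, which lets a machine tabulate polynomial values using nothing but repeated addition. A minimal Python sketch of that method (an illustration of the mathematics, not of Babbage's actual mechanism) looks like this:

```python
# Tabulate a polynomial, e.g. f(x) = x^2 + x + 1, using only addition,
# the way a difference engine does mechanically.
def tabulate(initial_differences, steps):
    """initial_differences: [f(0), first difference, constant second difference, ...]"""
    diffs = list(initial_differences)
    values = []
    for _ in range(steps):
        values.append(diffs[0])
        # Each column is updated by adding the column to its right: pure addition.
        for i in range(len(diffs) - 1):
            diffs[i] += diffs[i + 1]
    return values

# For f(x) = x^2 + x + 1: f(0) = 1, f(1) - f(0) = 2, constant second difference = 2.
print(tabulate([1, 2, 2], 6))  # -> [1, 3, 7, 13, 21, 31]
```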

What consistently set Babbage's inventions awry was simple: his designs were not practical without electricity and electronics, so they were largely forgotten for around a century, until his principles were rediscovered independently and gave birth to modern calculating machines.

The US censuses of 1890 and 1900 saw the first successful use of electromechanical statistics machines, thanks to statistician Herman Hollerith. Hollerith worked out a way of recording statistics by passing electrical currents through holes punched in specific positions on cards that were run through the machine, letting the device count and perform other calculations automatically. Even with this device, the 1890 census took two and a half years to tabulate, but by 1900 Hollerith had improved his machine so that cards could be fed automatically, cutting the tabulating time down to around a year and a half despite the increase in information. Hollerith eventually founded a firm that grew into what is now known as IBM, which over the next several decades improved electromechanical computation dramatically.
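In spirit, Hollerith's tabulator behaved like a bank of counters, each advanced whenever a circuit closed through a punched hole. A minimal Python sketch of that style of tallying, with an entirely hypothetical card layout:

```python
# Hypothetical census cards: each field/value pair stands in for a punch position.
from collections import Counter

cards = [
    {"state": "MO", "occupation": "farmer"},
    {"state": "KS", "occupation": "clerk"},
    {"state": "MO", "occupation": "clerk"},
]

counters = Counter()
for card in cards:
    for field, value in card.items():
        counters[(field, value)] += 1  # one "circuit closure" per punched hole

print(counters[("state", "MO")])        # -> 2
print(counters[("occupation", "clerk")])  # -> 2
```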

Over the next several decades, the quantity of information grew so quickly that more capable machinery became a necessity; without help, mankind would have been completely unable to handle it. As Plato observed centuries earlier, "necessity is the mother of invention," and faced with the pressure of the ever-growing quantity of information, science moved forward in leaps and bounds. Calculating machines became faster and more reliable, but when one problem was fixed, another was created, namely cost and space: the more information a machine could handle, the larger it had to be.

As electromechanical technology developed and computer science slowly emerged as a research field, science fiction began to develop as well. The 1920s through the 1940s can arguably be called the "golden age" of science fiction, with authors starting to stretch the imagination with technology, not just fantasy. The more computers figured in technological breakthroughs, the more they figured in the pages of well-read books as well, and with robotics, literature has as much of a hold on the technology as scientific development does.

Until the beginning of the twentieth century, the term "robot" wasn't even in the English language. It was first introduced in a 1920 play by Karel Čapek called R.U.R., or Rossum's Universal Robots, in which automata were mass-produced and used as labor. Instead of using the already familiar term "automaton," the play used a word derived from the Czech robota, meaning serf labor or drudgery: robot. From this point forward, "automaton" was slowly replaced by "robot" as the word for any artificial device created to perform a set of programmed functions.

Computers, or at least machines that closely resemble those we know today, first started to appear around 1930, when Vannevar Bush built the first analog computer, a fully automatic calculating machine, within the halls of the Massachusetts Institute of Technology (MIT).

The years 1937 to 1944 saw the construction of a machine built by Howard Aiken in partnership with IBM: the Automatic Sequence Controlled Calculator, or Mark I, as it was known at Harvard. The Mark I, while very large and slow compared to today's devices, was among the first digital computers, able to perform mathematical operations on numbers up to twenty-three digits long. Since the Mark I was electromechanical, it still used punched cards, along with electromechanical switches to control its electrical signals.

During the period when the Mark I was being constructed, the world's first "robot" made its appearance at the 1939 New York World's Fair. Elektro and his dog companion, Sparko, were relay-based robots built by the Westinghouse Electric Company. The seven-foot-tall mechanical man responded to voice commands, could "speak" thanks to pre-recorded 78 rpm records, moved his head and arms, and could even blow up balloons and "smoke" cigarettes. He is now on display at the Mansfield Memorial Museum.

Here is where computer history, robotics history, and the history of robots in science fiction begin to merge. Without the development of computer technology, robotics would not exist in the way it does today; without certain imaginative developments from the science fiction realm, robotics would not have developed the way it has. As science advanced, so did science fiction. With this in mind, the three histories merge into one track, all heading toward the same destination.

The 1940s were alive with discoveries in science and technology, pushing the development of the robot as we know it closer to reality. In 1941, a young author and biochemist published a science fiction short story titled Liar! Within its plot, Isaac Asimov, now known as one of the fathers of science fiction, introduced what became known as the Three Laws of Robotics. Asimov's laws, although simple enough, were instrumental in changing the world's perception of science and robots, for two reasons: first, the story contains the first known usage of the term "robotics" for the field, and second, his laws broke the pattern of robots being imagined as unfeeling machines, programmed for a singular purpose and acting as inherently good or inherently evil, a far cry from the machines we know and love on screen and in books today.

Around the same time, computer technology pioneer Alan Turing (of the Turing test, a foundation for artificial intelligence theories) and Harold Keen (of the British Tabulating Machine Company) completed the first Bombe, a machine designed specifically to decode encrypted Nazi messages.

The first fully electronic computer did not arrive until 1946, when the University of Pennsylvania's John William Mauchly and John Presper Eckert, Jr. unveiled ENIAC. ENIAC used vacuum tubes, which needed constant monitoring and replacement, and took up an unseemly amount of space and energy: the machine weighed over thirty tons and consumed around 150 kilowatts of power. Until William Bradford Shockley and a team of scientists at Bell Telephone Laboratories invented the transistor in 1947, vacuum tubes were the only way to build such circuits. The transistor allowed more complex circuits to be built, so transistorized computers were immensely faster and far superior to the models that used vacuum tubes, although they would remain extremely large and cumbersome for quite some time. Several more developments were needed before computers began to take the shapes we are familiar with today.

In 1951, MIT researchers started to experiment with direct keyboard input, doing away with the punched cards and tape still being used to feed information into machines. This development allowed computers to be thought of as more than calculating devices. That shift, combined with Jack Kilby's invention of the integrated circuit in 1958, gave machines the ability to handle larger amounts of information in a smaller amount of space. (Kilby won a Nobel Prize for his invention in 2000.)

The effects can be seen starting in the late 1950s when, at the height of the Cold War, SAGE, the first large-scale computerized communications network, was switched on. SAGE allowed sites in both the United States and Canada to communicate about incoming Soviet aircraft. Not long after, in 1962, one of the first "personal computers" was introduced to the world by, once again, scientists from MIT. The device, called LINC, was one of the first machines specifically designed for an individual user. Just two years later, in 1964, American Airlines and IBM put the SABRE system (which recycled IBM's earlier work on the US military's SAGE system) online, making it one of the most publicized computerized reservation systems available to the public. In fact, the mid-60s were ripe with new computer systems and developments: the first minicomputer, the DDP-116, was announced in 1965; HP entered the computer business in 1966; and the AGC, the Apollo Guidance Computer, debuted in 1968. Weighing roughly 70 pounds, the AGC first flew on Apollo 7, and one year later it guided the Apollo 11 moon landing.
Computers began to get smaller and cheaper, and to perform more functions than simple calculations. 1971 was quite a big year for firsts: Intel developed and brought to market the first commercial microprocessor, and Xerox put its name on the map by developing the first laser printer. HP's first handheld scientific calculator, the HP-35, followed shortly after, in 1972.
Throughout the next decade, computers began to appear in homes across the globe, now able to be useful in fields other than the sciences. The Xerox Alto, released in 1973, was a groundbreaking machine: it could work with other Alto computers over a shared network, had an interface driven by a keyboard and a mouse, and came with simple programs such as a word processor and a painting tool; it could also print and share files on a Xerox printer. Apple's Macintosh computer was later based on these ideas. A few years later, in 1976, Steve Wozniak completed the Apple I, a single-board computer designed for hobbyists. The Apple II followed quickly behind, released in 1977.
From here, things happened in rapid succession. Atari's first game console was released, followed by two Atari computer models in 1979, the same year Motorola's 68000 microprocessor came out. IBM's Personal Computer, the PC, arrived in 1981, with Apple's Lisa following in 1983 and its Macintosh in 1984, the year Michael Dell founded PC's Limited. Throughout the 80s, both Apple and IBM continued to release more computers, quickly cementing the two companies as the main competitors in the computer world. In 1989, Apple released its first portable computer, the Macintosh Portable. Unfortunately, the design weighed 16 pounds and was expensive, so the line didn't last long. The PowerBook laptop, released in 1991, was much more successful than the Portable; Apple only discontinued the PowerBook line in 2006.
Remember Charles Babbage's Difference Engine? In 1991, scientists built a working engine from his design, proving that it was completely sound. In 1996, the first PalmPilot was introduced, ushering in an age of handheld computers that would eventually lead to today's smartphones. Those candy-colored iMac desktops? They were released two years later, in 1998, and were met with roaring success. Camera phones were introduced in 2000, Apple's G5 computers in the mid-2000s, and from there, it becomes modern history.

With the continuing development of computers, robots have become easier to build, imagine, and mass-produce. Toys such as the Furby and Sony's AIBO have been put on the market with varying success, and robots have been sent on missions everywhere from the bottom of the ocean to the far reaches of space. They've appeared in TV series and comics, and have become some of the most iconic characters and figures in pop culture. The field of robotics only continues to grow, and to entwine with the fields of artificial intelligence and computer engineering.

Published timelines of artificial intelligence, robotics, and computer history are amazing resources if you want to learn more, or you can ask your reference librarian for directions to books in our collection!

Alchemy and the Mystic Sciences

Alchemy, the medieval forerunner of chemistry, was an interesting combination of mysticism and science, organized around the transformation of matter, with the ultimate goal of discovering an elixir of life. Two such alchemists, legend tells, were said to have successfully created thinking, talking "brazen men," men fashioned from brass.

The first was Albertus Magnus, a bishop, alchemist, and the man credited with discovering arsenic. Albertus was said to have labored for thirty years to create a metal statue, or androides, which "solved all problems, and cleared up all difficulties for its author" (Chambers, Ephraim, et al. A Supplement to Mr. Chambers's Cyclopædia; or, Universal Dictionary of Arts and Sciences. In Two Volumes. 1753). However, when, in his excitement, he called another monk into his chambers to behold his masterpiece, it was dismissed with horror and called a thing of necromancy, a diabolical art, an evil science. Later, the metal man was crushed, smashed to bits by Saint Thomas Aquinas. In one of the earliest texts found to contain the legend, the Rosaio della vita, the tale is told as a warning to readers who think they have more wisdom than God.

The second alchemist was the English friar, and credited inventor of spectacles, Roger Bacon (1214-1294). Toying with the idea of a mechanical man, Bacon was supposed to have built a talking brass head with the help of a conjurer. Wanting to surround England with a brass wall, the pair constructed the head over a span of seven years, believing it would teach them how to complete the gargantuan task. However, when the head was fully constructed and finally ready, it would not speak. After being told by a spirit that the head would eventually speak, but that if they failed to hear it their labor would be in vain, Bacon and his fellow friars attended to the head night and day. Once the friars were exhausted, they hired a guard to watch while they slept. One night, during the guard's watch, the head uttered the words "Time is," but the guard deemed the phrase too trivial to wake the friars. Half an hour later, the head spoke again, "Time was," but was once again ignored. At the passing of another hour, the head spoke a final time, "Time is past," before collapsing into dust. (San Francisco Call, Volume 80, Number 77, 16 August 1896.)

Like many legends and tales, Bacon's foray into the forbidden sciences was written up as a fictional story and later turned into a comedic play, Robert Greene's The Honorable Historie of Friar Bacon and Friar Bungay.

Linda Hall Lectures

Linda Hall Library is proud to share past lectures in digital format! To keep up to date with the most recent lectures going on at the Library, you can visit our Events page. If you're unable to attend in person, a livestream of the lecture may be available for some events.

About this lecture:
August 14, 2013, in the Linda Hall Library Auditorium
Dustin Abnet, PhD, Indiana University; Visiting Professor, Grand Valley State University

Dr. Dustin Abnet uses the history of robots and automata to explain changes in how Americans imagined the relationship between race and technology from the late 18th century to the present. Examining exhibition automata, toy catalogs, science and engineering periodicals, films, and other sources, he shows how imagining robots in the forms of different types of people helped Americans address the tensions created by the rise of industrial capitalism and the emergence of modern science and technology.