NASA’s Apollo program is often remembered as a peak moment the agency never built on. Critics point to the absence of moon bases, Mars landings, or a network of orbital stations and call the space program aimless after the 1969 moon landing. But seeing Apollo this way misses its biggest success: kickstarting the digital age.
Many accounts of Silicon Valley’s rise skip over NASA’s role, treating the space program and the digital revolution as separate stories. In doing so, they miss how much the Apollo program accelerated early digital technology.
Here’s how.
The Dawn of the Microchip Era: NASA’s Groundbreaking Use of Computer Chips
In the 1960s, NASA was among the first to use integrated circuits, the earliest computer chips, in the Apollo command and lunar modules. The chips were barely three years old when NASA adopted them, a bold risk at a time when even major companies like IBM hesitated.
NASA’s high demand and strict standards for these chips helped create a global market and slashed prices by nearly 90 percent in five years.
In a groundbreaking move, NASA was the first to rely on computer chips for human safety. If these tiny circuits could safely guide astronauts to the moon and back, it made sense to use them for everyday applications, from running factories to analyzing data. NASA’s trust in these chips opened the door for their widespread use.
Real-Time Computing
The space program introduced Americans and the wider world to the power and potential of digital technology. As millions watched Mission Control staff use computers to guide spacecraft to the moon, they gained a firsthand understanding of technology’s transformative capacity.
One of NASA’s significant contributions was the popularization of “real-time computing.” The concept may seem commonplace to the modern computer user, but in 1961, when the Apollo program started, it was revolutionary. Before then, computation typically meant submitting programs on stacks of punch cards and waiting hours, if not days, for the results.
In contrast, the Apollo spacecraft, hurtling toward the moon at 24,000 miles per hour (roughly 38,600 km/h), demanded instantaneous calculations. Real-time computing met that demand onboard, implemented on a machine that took up only a single cubic foot, a remarkable accomplishment in an era when batch-processing machines filled vast rooms.
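To see the contrast with batch processing in miniature, here is a purely illustrative Python sketch of a fixed-rate control loop, the basic shape real-time guidance takes. Everything in it (the 100 ms cycle, the sensor reading, the update step) is a hypothetical placeholder, not the AGC’s actual design.

```python
import time

CYCLE_SECONDS = 0.1  # hypothetical budget: a fresh answer every 100 ms


def read_sensors():
    # stand-in for sampling accelerometers and gyros each cycle
    return {"velocity_mps": 10_700.0}


def update_guidance(state, measurement):
    # stand-in for the navigation update that must finish within the cycle
    state["velocity_mps"] = measurement["velocity_mps"]
    return state


state = {"velocity_mps": 0.0}
for _ in range(10):  # ten cycles, just for illustration
    deadline = time.monotonic() + CYCLE_SECONDS
    state = update_guidance(state, read_sensors())
    # in batch processing the result might arrive hours later;
    # here it must exist before the next cycle begins
    time.sleep(max(0.0, deadline - time.monotonic()))
```

The point is the deadline: a batch job answers whenever it finishes, while a real-time system is wrong if the answer arrives late.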
Thus, the Apollo program, far from being a glorious relic of an aimless space program, holds its legacy in every microchip and real-time computation that powers our digital age.
A User-Friendly Computer Interface
Another critical revolution set in motion by the Apollo program was the transformation of how people with no background in computer science, in this case astronauts, interacted with computers. This was exemplified by the Apollo Guidance Computer’s (AGC) interface, a compact display and keyboard unit known as the Display and Keyboard (DSKY).
The astronauts communicated with the AGC through the DSKY by entering concise numeric codes: two-digit “verb” and “noun” pairs that told the computer what action to perform and which data or program to act on. The DSKY, in turn, responded through five lines of numeric displays and a small panel of sixteen labeled lights. This rudimentary but effective system served as a critical channel of communication between humans and machines.
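As a rough illustration of how such a numeric interface works, here is a toy Python sketch of a verb/noun lookup. The codes and the displayed value are invented for the example and are not the AGC’s real assignments.

```python
# Toy model of a DSKY-style interface: a two-digit "verb" names an action,
# a two-digit "noun" names the data it acts on. All codes here are invented.
VERBS = {16: "display"}                     # hypothetical verb: show a value
NOUNS = {36: "mission_clock"}               # hypothetical noun: elapsed time

TELEMETRY = {"mission_clock": "102:45:40"}  # made-up reading


def dsky(verb: int, noun: int) -> str:
    action = VERBS.get(verb)
    target = NOUNS.get(noun)
    if action == "display" and target in TELEMETRY:
        return f"{target}: {TELEMETRY[target]}"
    return "OPR ERR"  # the real DSKY signaled mistakes with a warning lamp


print(dsky(16, 36))  # -> mission_clock: 102:45:40
```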
The DSKY fundamentally changed the dynamic between humans and computers. In our present era of touch-sensitive screens, graphical user interfaces, and voice commands, we interact with computers effortlessly, without needing any deep understanding of their inner workings. In the early 1960s, however, that level of accessibility was a groundbreaking concept.
At the time, computers were the realm of specialists: complex, intimidating machines that demanded expert knowledge to operate. The DSKY changed that. It democratized computer access, offering an interface through which even non-specialists could command the machine. This shift was not just radical; it laid the foundation for the user-centered design philosophies at the heart of the digital interfaces we use today. The DSKY, and the Apollo program more broadly, played a pioneering role in shaping this user-centric digital landscape.
The Birth of Software Engineering at NASA
Margaret Hamilton, an unsung hero of the Apollo missions, played an instrumental role in the shift to the digital age. Born in 1936, Hamilton earned a degree in mathematics with a minor in philosophy from Earlham College. After graduation, she worked as a high school teacher, at an insurance firm, and as a systems programmer at MIT.
Her true calling emerged when she joined the Semi-Automatic Ground Environment (SAGE) project at the Lincoln Laboratory, working on the nation’s first air defense system. This role introduced Hamilton to the world of software engineering, a discipline she would come to define.
During the Apollo program, Hamilton joined the MIT Instrumentation Laboratory, which developed the onboard flight software for the missions, and rose to become Director of its Software Engineering Division. She led a team of programmers writing code for the Apollo Guidance Computer, the machine that controlled both the command module and the lunar module. Beyond the coding itself, she developed rigorous testing and verification procedures that would later become standard practice in software development.
Margaret Hamilton’s influence extends beyond the Apollo missions. She is credited with coining the term “software engineering” itself, and she advocated for its recognition as a legitimate discipline at a time when the term was neither widely used nor respected. Her efforts laid the groundwork for software engineering to be acknowledged as a vital field in its own right.
Hamilton’s pioneering work in system design and software development has profoundly impacted modern computing. Her contributions also highlight the significant role women played in the space race and the digital revolution, a narrative that is often overlooked.
Shaping the Future: NASA’s Pioneering Role in Digital Photography
Another remarkable NASA contribution, one that changed how we capture and store memories today, was the digital camera. While the first working digital camera was built at Eastman Kodak in 1975, the concept took shape inside NASA in the 1960s.
Eugene F. Lally, a NASA engineer, a photography enthusiast since his teens, and the mind behind a popular technique for reducing red eye in photographs, proposed the concept that would eventually materialize as the digital camera. Lally envisaged a mosaic array of optical sensors that could sense light, digitize it, and turn the signals into electronic images. Those images could then be fed to a computer that tracked celestial bodies in real time during a mission, much as sailors once navigated by the stars with a sextant.
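The core of Lally’s idea, sampling a mosaic of sensors, digitizing the readings, and locating a bright reference point, can be sketched in a few lines of Python. The 4x4 mosaic and the 8-bit scale below are invented purely for illustration.

```python
# A tiny "mosaic" of analog light readings (0.0 to 1.0), invented for the example.
mosaic = [
    [0.02, 0.03, 0.10, 0.02],
    [0.01, 0.85, 0.12, 0.03],  # the bright cell stands in for a star image
    [0.02, 0.04, 0.05, 0.02],
    [0.03, 0.02, 0.01, 0.02],
]

# Digitize each reading into an 8-bit value, as an image sensor's converter would.
digital = [[round(v * 255) for v in row] for row in mosaic]

# Find the brightest pixel: a crude stand-in for one step of star tracking.
star = max(
    ((x, y) for y, row in enumerate(digital) for x in range(len(row))),
    key=lambda p: digital[p[1]][p[0]],
)
print("brightest pixel at", star)  # -> brightest pixel at (1, 1)
```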
The phrase “digital photography” was coined by Lally while he was working at the Jet Propulsion Laboratory (JPL). Interestingly, another NASA engineer at JPL, Frederic Billingsley, introduced the term “pixel”, short for picture element, in 1965.
Building on Lally’s research, the first digital cameras used solid-state sensors, primarily metal-oxide-semiconductor (MOS) capacitors, which convert incoming photons into electrical charge. The charge collected at each site was stored and then read out in a way that let an image be assembled, taking over the role that film had played. By the 1970s, charge-coupled devices (CCDs) had become the standard for electronic cameras.
Despite these advances, CCDs remained analog devices. The additional circuitry needed to convert their charge into digital information made them power-hungry and susceptible to radiation, a serious drawback in space. Those limitations set the stage for the search for more compact, lightweight, and energy-efficient cameras for long-duration missions.
In the 1990s, a NASA JPL team led by Eric Fossum picked up the baton, building on the active pixel sensor technology initially developed by Olympus in Japan. Their version cut power consumption to less than one-hundredth of a CCD sensor’s without compromising image quality.
NASA’s use of complementary metal-oxide-semiconductor (CMOS) fabrication, rather than the PMOS process Olympus had used, led to cheaper and more efficient image sensors. The team also built additional functions directly onto the chip, enabling radical reductions in camera size and power consumption. These CMOS cameras could also be radiation-hardened for use in deep space.
Recognizing the potential of this technology beyond spacecraft, Fossum and three other JPL engineers founded Photobit to commercialize their discoveries. Their technology, licensed exclusively from JPL, paved the way for the first megapixel sensors capable of more than 500 frames per second. The advances were soon adopted in the film industry, medical imaging, weapons imaging, and more, leading to consumer digital cameras, webcams, and mobile phones.
Thus, what began as NASA’s quest for better cameras for space exploration laid the groundwork for the digital cameras we now take for granted. It is one more example of how NASA’s research and development spills over into products that profoundly shape everyday life.
Sources
- “What You Didn’t Know About the Apollo 11 Mission” on the Smithsonian Magazine website
- “Apollo Technology: Back to the Future” on the NASA website
- Margaret Hamilton on Wikipedia
- “Moon-landing innovations: 12 items we wouldn’t have without space travel” on the CEO Magazine website
- “5 Moon-landing Innovations that changed life on Earth” on The Conversation website