Source Details
- Video Title: Early Computing: Crash Course Computer Science #1
- Channel/Author: CrashCourse
- Publication Date: March 2, 2017
- URL:
I. Introduction: The Driving Force Behind Computation
The 20th century witnessed an unprecedented "explosion of complexity, bureaucracy, and ultimately data," fueled by global population growth, large-scale conflicts like World Wars I and II, and advancements in trade and scientific endeavors. This increasing complexity created an "increasing need for automation and computation," leading to the development of early computing devices. Initially, these were "special purpose computing devices, like tabulating machines," which aided governments and businesses by "aiding, and sometimes replacing, rote manual tasks." However, the limitations of these early machines spurred continuous innovation in search of faster, more reliable, and more general-purpose computing solutions.
II. Early Electro-Mechanical Computers: Power and Limitations
The initial response to the growing computational demand was the creation of "room-sized behemoths" known as electro-mechanical computers.
A. The Harvard Mark I (1944)
- Scale and Purpose: One of the most significant electro-mechanical computers was the Harvard Mark I, built by IBM for the Allies during World War II. It was immense, containing "765,000 components, three million connections, and five hundred miles of wire." Its internal mechanics were synchronized by a "50-foot shaft running right through the machine driven by a five horsepower motor."
- Early Applications: Early uses included "running simulations for the Manhattan Project."
- Core Mechanism (Relays): The "brains of these huge electro-mechanical beasts were relays: electrically-controlled mechanical switches." A relay functions like a "water faucet," using a "control wire" to open or close a circuit. When current flows through a coil, an "electromagnetic field is created, which in turn, attracts a metal arm inside the relay, snapping it shut and completing the circuit."
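The relay's role as an electrically controlled switch can be sketched in a short toy model (a hypothetical illustration, not from the video; the `Relay` class and its method names are invented, while the 50-switches-per-second figure is the one quoted above):

```python
class Relay:
    """Toy model of an electro-mechanical relay: current through a coil
    creates an electromagnetic field that attracts a metal arm, snapping
    it shut and completing the circuit."""

    SWITCHES_PER_SECOND = 50  # a good relay in the 1940s, per the video

    def __init__(self):
        self.closed = False  # arm position: circuit open by default

    def set_control(self, current_on: bool) -> None:
        # Energizing (or de-energizing) the coil moves the arm.
        self.closed = current_on

    def output(self, input_signal: bool) -> bool:
        # Current flows through only when the arm is snapped shut.
        return input_signal and self.closed


relay = Relay()
relay.set_control(True)    # energize the coil via the control wire
print(relay.output(True))  # True: circuit completed

# The arm has mass and can't move instantly, which caps switching speed:
print(1 / Relay.SWITCHES_PER_SECOND)  # 0.02 seconds per flick
```

The key point the model captures is that one electrical signal (the control wire) governs whether another can flow, which is all a computer's switches ever need to do.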
B. Limitations of Electro-Mechanical Relays
Despite their groundbreaking nature, relays had significant drawbacks:
- Slow Switching Speed: The mechanical arm in a relay "has mass, and therefore can’t move instantly." In the 1940s, a good relay could only "flick back and forth fifty times in a second." This resulted in very slow operations: the Harvard Mark I could perform "3 additions or subtractions per second; multiplications took 6 seconds, and divisions took 15." More complex operations, like trigonometric functions, could "take over a minute."
- Wear and Tear/Unreliability: As mechanical parts, relays were prone to "wear over time," leading to breakage or unreliability. With a machine like the Harvard Mark I having "roughly 3500 relays," the probability of failure was high. It was estimated that "you’d have to replace, on average, one faulty relay every day!" This made completing an "important, multi-day calculation" extremely challenging.
- "Bugs": These large, warm machines also attracted insects. In September 1947, a "dead moth" was found in a malfunctioning relay on the Harvard Mark II, leading Grace Hopper to note, "From then on, when anything went wrong with a computer, we said it had bugs in it." This is the origin of the term "computer bug."
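The one-relay-per-day estimate above can be unpacked with some hedged arithmetic (the per-relay failure probability and the independence assumption are inferred for illustration, not stated in the video):

```python
relays = 3500           # roughly the Harvard Mark I's relay count
failures_per_day = 1    # the video's replacement estimate

# Implied daily failure probability for a single relay,
# assuming failures are independent and identically likely:
p_fail = failures_per_day / relays
print(p_fail)  # 1/3500, about 0.000286

# Chance the whole machine gets through one day with zero failures:
p_clean_day = (1 - p_fail) ** relays
print(round(p_clean_day, 3))  # about 0.368, close to 1/e
```

Even a tiny per-component failure rate, multiplied across thousands of components, makes an uninterrupted multi-day run unlikely, which is why the move to components with no moving parts mattered so much.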
III. The Shift to Electronic Computing: Vacuum Tubes
The clear need for a "faster, more reliable alternative to electro-mechanical relays" led to the adoption of vacuum tubes, marking the "shift from electro-mechanical computing to electronic computing."
A. Development of Vacuum Tubes
- Diode (1904): English physicist John Ambrose Fleming developed the first vacuum tube, a "thermionic valve," which housed "two electrodes inside an airtight glass bulb." One electrode, when heated (thermionic emission), emitted electrons, while the other attracted them if positively charged, creating a one-way current flow. This was the first "diode."
- Triode (1906): American inventor Lee de Forest improved upon this by adding a "third 'control' electrode." By manipulating the charge on this control electrode, the flow of electrons could be permitted or prevented, effectively creating an electronic switch.
- Advantages over Relays: Vacuum tubes had "no moving parts," leading to "less wear" and, crucially, the ability to "switch thousands of times per second."
- Limitations: While a significant improvement, vacuum tubes "weren’t perfect - they’re kind of fragile, and can burn out like light bulbs." Initially, they were also expensive, but "by the 1940s, their cost and reliability had improved to the point where they became feasible for use in computers… at least by people with deep pockets, like governments."
B. Pioneering Vacuum Tube Computers
- Colossus Mk 1 (1943): The "first large-scale use of vacuum tubes for computing" was the Colossus Mk 1, designed by Tommy Flowers for code-breaking at Bletchley Park, UK, to decrypt Nazi communications. It contained "1,600 vacuum tubes" and is "regarded as the first programmable, electronic computer." Programming involved "plugging hundreds of wires into plugboards."
- ENIAC (1946): The Electronic Numerical Integrator and Calculator, or ENIAC, was completed by John Mauchly and J. Presper Eckert at the University of Pennsylvania. It was the "world's first truly general purpose, programmable, electronic computer." ENIAC was incredibly fast for its time, performing "5000 ten-digit additions or subtractions per second." Despite its power, "with that many vacuum tubes failures were common, and ENIAC was generally only operational for about half a day at a time before breaking down."
IV. The Transistor Era: Miniaturization and Reliability
By the 1950s, even vacuum-tube-based computing was "reaching its limits" in terms of cost, size, and reliability. A "radical new electronic switch" was needed, leading to the invention of the transistor.
A. The Transistor (1947)
- Invention: In 1947, Bell Laboratory scientists John Bardeen, Walter Brattain, and William Shockley "invented the transistor, and with it, a whole new era of computing was born!"
- Mechanism: A transistor is also a switch, similar to relays and vacuum tubes, controlled by an electrical current. It uses a "semiconductor" material and a "gate" electrode to manipulate conductivity, allowing or stopping current flow.
- Key Advantages:
- Speed: Even the "very first transistor at Bell Labs" could switch "10,000 times per second," significantly faster than vacuum tubes. Today's transistors can switch "millions of times per second."
- Durability: Unlike fragile vacuum tubes, transistors are made of solid material and are known as "solid state" components, making them far more robust. They "can run for decades."
- Size: Almost immediately, transistors "could be made smaller than the smallest possible relays or vacuum tubes." Today, they are "smaller than 50 nanometers in size."
- Cost: Transistors were dramatically "cheaper" than previous components.
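Relays, vacuum tubes, and transistors are all instances of the same abstraction: a switch controlled by another signal. A minimal hypothetical sketch (the `transistor` function and the series wiring are illustrative inventions; the switching rates are the ones quoted in this outline):

```python
def transistor(gate_charged: bool, current_in: bool) -> bool:
    """Toy model: charge on the gate electrode makes the semiconductor
    conductive, letting current flow; otherwise current is blocked."""
    return current_in and gate_charged

# Two such switches in series conduct only when both gates are charged --
# a glimpse of how simple switches compose into useful logic.
def both_closed(a: bool, b: bool) -> bool:
    return transistor(b, transistor(a, True))

print(both_closed(True, True))   # True: current makes it through
print(both_closed(True, False))  # False: the second switch blocks it

# Switching-rate comparison, using the video's figures:
relay_hz = 50                # a good 1940s relay
first_transistor_hz = 10_000 # the very first transistor at Bell Labs
print(first_transistor_hz // relay_hz)  # 200x faster
```

The abstraction is identical to the relay's; only the physics changed, trading a moving arm for charge on a gate, which is where the speed and durability gains come from.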
B. Transistor-Powered Computers
- IBM 608 (1957): The "first fully transistor-powered, commercially-available computer" was the IBM 608. It contained "3000 transistors" and could perform "4,500 additions, or roughly 80 multiplications or divisions, every second."
- Widespread Adoption: IBM quickly "transitioned all of its computing products to transistors, bringing transistor-based computers into offices, and eventually, homes."
C. Silicon Valley
The development of transistors and semiconductors, often made from silicon, led to the concentration of related industries in the Santa Clara Valley in California, which "soon became known as Silicon Valley." This region became a hub for innovation, with William Shockley founding Shockley Semiconductor, whose employees later founded Fairchild Semiconductor, whose employees in turn founded Intel, "the world’s largest computer chip maker today."
V. Conclusion: The Foundation of Modern Computing
The progression from cumbersome, slow, and unreliable electro-mechanical relays to fast, durable, and tiny transistors laid the fundamental groundwork for all modern computing. This technological leap allowed for an exponential increase in computational power, paving the way for the complex digital world we inhabit today. The ability to "turn electricity on and off really, really, really fast" through these evolving switching mechanisms is the core principle upon which all subsequent computer advancements are built.