A brief history of the computer
With computers now commonplace in every home, workplace and pocket, Simon Handby traces the development of the technology that changed the world
POST-WAR
The next three decades would see numerous inventions and innovations in electronics that set a pattern for computer technology which continues today: as technology improves, computers increase in complexity, processing power and affordability, while their heat output and power consumption fall. Early milestones included the invention of random access memory (RAM) in 1949 and the development of the trackball by the Royal Canadian Navy in 1952, but computers remained the preserve of governments, universities and large corporations that could afford the hardware and the expert staff to operate and maintain them.
The first trackball works much like today's examples, although it's nowhere near as ergonomic
One of the most important breakthroughs came in 1947, when Bell Labs built the first working transistor – a semiconductor device that can perform the same functions as a valve. It took some years for the technology to be refined, but the first transistor computer appeared in 1953, heralding a second generation of more sophisticated machines. The first fully transistorised computer followed in 1957, yet a second major innovation at the end of the 1950s would play an equally important role in bringing computers within reach of the masses.
CIRCUIT TRAINING
While the earliest transistors were self-contained components, smaller than a valve but still challenging to build into a complex device, in 1958 Jack Kilby, an engineer at Texas Instruments, was working on ways to modularise them so that they could be assembled in grids. Kilby subsequently hit on the idea of building multiple components on a single piece of semiconductor substrate – the essence of the integrated circuit (IC). He built the first working IC from germanium, while in 1959 Robert Noyce independently built the first silicon example. Kilby's invention proved so revolutionary that by his death in 2005 he had received the Kyoto Prize and the Nobel Prize in Physics, and had been awarded no fewer than nine honorary doctorates.
By 1962 simple ICs containing just a few transistors were being manufactured in small numbers at high cost, and were used almost exclusively in ballistic missile guidance systems. However, growing demand helped to reduce costs and improve manufacturing processes. Chips were made with more and more transistors, prompting Gordon E. Moore, who would later co-found Intel, to make the 1965 observation now known as Moore's Law. Moore originally predicted that the number of transistors on a chip would double every year, although in 1975 he revised this to a doubling every two years – an estimate that has proved uncannily accurate. By the end of the 1960s, ICs were being mass-produced and the most advanced chips contained hundreds of transistors.
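Moore's observation amounts to a simple exponential rule: take a baseline transistor count and double it for every two years that pass. The short Python sketch below illustrates the arithmetic; the 64-transistor chip and 1965 baseline are purely hypothetical figures chosen to show the shape of the curve.

# Moore's Law in its revised form: transistor counts double every two years.
def projected_transistors(baseline, baseline_year, year):
    # Each elapsed two-year period contributes one doubling.
    return baseline * 2 ** ((year - baseline_year) / 2)

# A hypothetical 64-transistor chip in 1965 sees five doublings by 1975.
print(projected_transistors(64, 1965, 1975))  # 2048.0

Ten years at that pace multiplies the count 32-fold, which is roughly how chips went from a handful of transistors to hundreds within a single decade.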