I was asked by a new colleague – he is new to the software industry – what I thought were the most significant points in the history of computing. Having a few hours to kill on a Sunday afternoon in rainy Scotland, I spent some time coming up with a list. Here are my top 28 milestones. It seems like there should be an even 30. What did I miss?
1. The number zero
800 AD – India. Since you can’t have a computer without 1s and 0s, I think the invention of the number zero is significant. You can argue whether this happened in Egypt, Mesopotamia or India. In my opinion it was India, as Indian mathematicians were the first to treat zero as a number in its own right and had used decimal place-value notation since the year 595.
2. Pascal adding machine
1642 – France. Blaise Pascal builds the Pascal Adding Machine – the first workable calculator. To me this is more significant than Napier’s bones, the development of logarithm tables or some mechanical devices like the watch or the quadrant, because the device does the computing.
3. Binary number system
1679 - Germany. Gottfried Leibniz perfects the binary number system.
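Leibniz’s insight is easy to demonstrate: any whole number can be written using only 1s and 0s by repeatedly dividing by two and collecting the remainders. A quick sketch in Python (the function name is mine, just for illustration):

```python
def to_binary(n):
    """Convert a non-negative integer to its binary string,
    collecting remainders of repeated division by 2."""
    if n == 0:
        return "0"
    bits = []
    while n > 0:
        bits.append(str(n % 2))  # least significant bit first
        n //= 2
    return "".join(reversed(bits))

print(to_binary(42))       # 101010
print(int("101010", 2))    # 42 – and back again
```

Every digital computer since has been doing essentially this, billions of times a second.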
4. Electricity
1751 – USA. Computers don’t work without electricity, so Ben Franklin’s experiments with electricity should make the list.
5. Textile loom
1801 – France. Joseph Jacquard builds his textile loom using the concept of a punch card to weave intricate designs into cloth. This is the foundation of a programmable machine.
6. Analytical engine
1833 – UK. Charles Babbage has the idea for the Analytical Engine, and although he didn’t build it, it sets the foundations for all modern computers. Augusta Ada Byron, AKA Ada Lovelace, who worked with him, proposed using punched cards like Jacquard’s loom to make it programmable.
7. Boolean algebra
1854 – UK. George Boole creates Boolean algebra, laying the foundation of information theory. This is where “and”, “or” and “not” come into mathematical formulas. This was later used by Charles Sanders Peirce to develop the idea that Boole’s logic lends itself to electrical switching circuits. It would be 50 years until Bertrand Russell presented the idea that this is the foundation of all mathematics and another 30 years until Claude Shannon incorporated the symbolic “true or false” logic into electrical switching circuits.
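Boole’s three operations survive unchanged in every modern programming language. A small sketch printing their truth tables, and checking one of the identities his algebra gives us (De Morgan’s law):

```python
# Boole's "and", "or" and "not", as Shannon later mapped them onto
# switching circuits: a closed switch is True, an open one is False.
for a in (False, True):
    for b in (False, True):
        print(a, b, "->", a and b, a or b, not a)

# De Morgan's law: NOT (a AND b) is the same as (NOT a) OR (NOT b).
assert all(
    (not (a and b)) == ((not a) or (not b))
    for a in (False, True)
    for b in (False, True)
)
```

Wire those truth tables up as relays or transistors and you have the logic gates every processor is built from.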
8. Thermionic emissions
1883 – USA. Thomas Edison observes thermionic emission (the “Edison effect”), the basis of the vacuum tube, which, in turn, becomes the building block of the entire electronics industry. When the triode vacuum tube was invented in 1907 it enabled amplified radio and telephone technology.
9. Nipkow disk
1925 – UK. You could argue that the TV gets its roots from fax transmissions back in 1843, but when amplification made television practical, Scottish inventor John Logie Baird employed the Nipkow disk in his prototype video systems.
10. Automatic programming
1936 – UK. I have watched a few documentaries on Alan Turing and visited a museum exhibit on him here in the UK. Pretty amazing guy. He provided the basis for the development of automatic programming by showing that a universal computing machine can simulate any other computing machine. If it wasn’t for him, the Bombe, the electromechanical machine used to break the German Enigma cipher, would not have been built.
11. Transistor
1947 – USA. John Bardeen, Walter Brattain and William Shockley invent the transistor at Bell Labs.
12. Magnetic core memory
1949 – USA. An Wang invents magnetic core memory. He doesn't build it but sells the patent to IBM for $400K to fund the start of his own company. His concept isn't practical on its own until Jay Forrester at MIT arranges the cores into a matrix, giving the idea far greater practical application. Core memory goes on to displace the earlier cathode-ray-tube memory developed by Freddie Williams.
13. Compiler
1952 – USA. Grace Hopper pioneers the idea of higher-level computer languages and builds the first compiler, so we can program in words, not numbers. This gives rise to COBOL, the first language to run on multiple types of computers.
14. Computer networking
1953 – USA. The airline industry develops the Semi-Automatic Business Research Environment (SABRE) with two connected mainframes, the start of computer networking. The project did borrow some logic from the military SAGE project. I feel it is the foundation of networking, which really took off after Robert Metcalfe created Ethernet at Xerox; the current Internet, though, gets its roots from ARPANET in 1969, which later adopted TCP/IP and is the ancestor of today's Internet.
15. John F. Kennedy
1961 – USA. John F. Kennedy delivers his “I believe we should go to the moon” speech, which put funding and research into computer science.
16. Database
1963 – USA. The database is critical to today’s computing environment. The first reference I can find to a commercial database comes from General Electric’s release of IDS.
17. IBM System/360
1964 – USA. IBM releases the IBM System/360, the first computer system with the concept of a modular, compatible general purpose computer. This leads to the expansion of computer systems and the foundation of the personal computer market. Some could argue that it was the DEC PDP-11, introduced in 1970, that really led to the PC market. The PDP-11 was just easier to program, had general-purpose registers and interrupts, and could be manufactured with semi-skilled labor.
18. Human-computer interaction
1964 - USA. The first concepts of a mouse, graphical user interface and hypertext are created by Doug Engelbart. It wasn’t until 10 years later that Xerox PARC developed the Alto that was later stolen by Microsoft and Apple.
19. Moore’s Law
1968 – USA. Gordon Moore and Robert Noyce found Intel to build integrated circuits. Moore had already proffered Moore’s Law in 1965, while still at Fairchild Semiconductor, observing that the number of components on a chip was doubling roughly every year (a rate he later revised to every two years).
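Moore’s Law, the observation that transistor counts on a chip double roughly every two years, reduces to simple arithmetic. A sketch with illustrative figures (the starting point is the Intel 4004’s roughly 2,300 transistors in 1971; the function is mine):

```python
def transistors(start_count, start_year, year, doubling_period=2):
    """Project a transistor count forward under Moore's Law:
    one doubling every `doubling_period` years."""
    return start_count * 2 ** ((year - start_year) / doubling_period)

# Two years on from the 4004: one doubling.
print(transistors(2300, 1971, 1973))   # 4600.0

# Projected out to the year 2000: tens of millions of transistors,
# which is the right order of magnitude for chips of that era.
print(round(transistors(2300, 1971, 2000)))
```

The startling part is not the formula but that the industry held to it for half a century.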
20. First software patent
1968 – USA. The first software patent is issued to Martin Goetz. Without this the software industry could not have received the capital to develop.
21. Video games
1972 – USA. The video game market can be traced all the way back to 1948 with a checkers game built by IBM. But it really took off when Nolan Bushnell created Atari and Pong succeeded (his second game; his first, Computer Space, was too hard to play). This is what got the younger generation, and people of my age, excited about the industry.
22. 8-bit 8008
1972 – USA. Intel releases the 8008, the first 8-bit microprocessor (the 4-bit 4004, a year earlier, was the first microprocessor of any kind), soon replaced by the 8080. These chips led to the PC revolution.
23. Spreadsheet
1979 – USA. VisiCalc, the first electronic spreadsheet, is created by Dan Bricklin and Bob Frankston. It sets the stage for Lotus 1-2-3 and Excel years later, but it also spurs the demand for PCs on people’s desks.
24. PostScript language
1982 – USA. The idea of the PostScript language was conceived in 1976 by John Warnock. He joined Xerox PARC, which had developed the first laser printer and had recognized the need for a standard means of defining page images. He left Xerox and founded Adobe Systems in 1982.
25. World Wide Web
1989 – UK. The World Wide Web is created at the CERN physics laboratory by Sir Tim Berners-Lee. Although the proposal was published in 1989, the system was built in 1990 and launched in 1991.
26. Web browser
1993 – USA. Although Berners-Lee did build the first web browser, I feel that Mosaic is really the first consumer web browser.
27. RISC architecture
1985 – UK. The first ARM processor, designed at Acorn Computers, runs in 1985; the design is later spun out as ARM Holdings, the company (with a great business model) that made smartphones possible. Built on the RISC architecture, ARM chips require fewer transistors, which reduces cost, power and heat. I feel this is the last major invention of the computing world we know today.
28. Search engine
1993 – UK. The first tool used for searching on the Internet was Archie, created in 1990 by Alan Emtage. But JumpStation, by Jonathon Fletcher, used a bot to find web pages and build its index. It used a web form as the interface to its query program and was the first engine to combine the three essential features of a web search engine: crawling, indexing and searching. If it weren’t for search, the tech boom we have all experienced would not have had the same velocity.
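Those three essential features can be sketched in miniature. In this toy Python version an in-memory dict of pages stands in for crawling (a real crawler fetches pages over HTTP); indexing builds an inverted index from words to pages; searching intersects the results. The page names and texts are made up for illustration:

```python
# "Crawled" pages: a dict standing in for fetched web documents.
pages = {
    "page1": "the quick brown fox",
    "page2": "the lazy dog sleeps",
    "page3": "quick dog quick fox",
}

# Indexing: an inverted index mapping each word to the pages containing it.
index = {}
for url, text in pages.items():
    for word in set(text.split()):
        index.setdefault(word, set()).add(url)

# Searching: intersect the page sets for every word in the query.
def search(query):
    results = [index.get(w, set()) for w in query.split()]
    return sorted(set.intersection(*results)) if results else []

print(search("quick fox"))   # ['page1', 'page3']
```

Strip away the scale, ranking and ad auctions, and this crawl–index–intersect loop is still the skeleton of every web search engine since JumpStation.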
It will be interesting to take a look at this in 20 years’ time and see what gets added to the list.