In November 2008 I was asked by The Sunday Times to write an article on the history of computing - they were working on a technology supplement at the time. The chance to write for a newspaper doesn't come around too often, so I seized the opportunity. I put quite a lot of effort into the article, and the agency working on the supplement kept the pressure on to get it done. I submitted the article on time, and much to my disappointment I never heard from the agency again - nor was the article (or the supplement) ever published. I guess The Sunday Times pulled the plug on the supplement - I didn't even get the courtesy of a short note thanking me for the article and telling me it would not be published. My journalism career was over before it started.
Going through some old documents on my computer I came across the article. As it is now well over a year since I wrote it, I no longer consider it to be owned by The Sunday Times, and I now publish the original article here in full. I am presenting a lunchtime seminar in NCI on Friday on the subject of self-publishing - so there is method in my madness in publishing this article here. I will be mentioning this post on Friday. It is far too long for a blog post, but here goes anyway...
Technology Today - How Did We Get Here?
Technology is everywhere today, so much so that it is difficult to imagine a time when there were no desktop computers, laptops, satellite TV, mobile phones, digital cameras, MP3 players, and gaming devices around. Only a few short years ago we hadn't heard of tools such as the World Wide Web, email, spreadsheets, databases, search engines, blogs, and podcasts. We are now so used to common names such as Google, iPod, YouTube, Facebook, and Bebo that some have become part of the English language. Just look at a movie like All the President's Men, made just over thirty years ago in 1976: in the office scenes you will see Robert Redford working on a typewriter, with not a computer in sight. Contrast that with today, when you will see computers on every desk in any office, and not a typewriter in sight.
Over the centuries there has been a slow growth in the development of technology. While advances such as the invention of the printing press by Johannes Gutenberg in 1439, photography by Louis-Jacques-Mandé Daguerre in 1839, the telephone by Alexander Graham Bell in 1876, television by John Logie Baird in 1925, and the atomic bomb by Robert Oppenheimer and his team in 1945 are well known, information technology (IT) has also had an interesting history from earliest times up to today. So how and where did IT start? Who were the Gutenbergs and Bells of IT? Let's take a look at the history of technology and see how we got where we are today.
Ever since Stone Age man first used pebbles, hash marks on bones and walls, and even fingers to count, we have been searching for ways to make calculations such as adding and subtracting mechanically. One of the first computing devices was the abacus - a calculating tool used for performing arithmetic. The ancient Greeks used the abacus as long ago as the 5th century BC, and it is still in use today by merchants and traders in some Asian and African countries for counting. The Chinese developed an abacus called the suanpan in the 14th century AD, with which it is possible to multiply and divide as well as to add and subtract. In 1617, John Napier, a Scotsman, developed a new calculating device, called Napier's Bones, which could be used for more difficult calculations such as square roots. Napier is also known as the inventor of logarithmic tables - well loved by Maths students in the pre-calculator days.
In 1645, the French philosopher and mathematician Blaise Pascal invented the first digital calculator, called the Pascaline. This device could add and subtract by turning dials on the machine's face. We can all be grateful to Pascal, as the reason he invented this machine was to help his father with his work collecting taxes! In 1671, the German philosopher and mathematician Gottfried Leibniz, who described the binary number system that is the basis of virtually all modern computer architectures, also began work on a machine called the Stepped Reckoner, which was the first calculator to perform all four arithmetic operations - add, subtract, multiply, and divide. Calculators with mechanisms like this were used for the next 300 years - even up to the 1970s.
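Leibniz's binary idea is easy to see in action. The short Python sketch below (my own illustration, not from the original article) shows how any number reduces to the 0s and 1s that modern computers actually store:

```python
# Leibniz's binary system in action: every whole number reduces to 0s
# and 1s, which is exactly how modern computers store and manipulate data.

def to_binary(n):
    """Repeated division by two - the classic way to find a number's bits."""
    bits = ""
    while n > 0:
        bits = str(n % 2) + bits  # remainder gives the next bit
        n //= 2
    return bits or "0"

print(to_binary(13))  # 1101: one 8, one 4, no 2, one 1

# The four operations the Stepped Reckoner performed work just the same
# on binary numbers:
print(int("1101", 2) + int("101", 2))  # 13 + 5 = 18
```

Python's built-in `bin()` does the same conversion; the hand-rolled version simply makes the repeated-division idea visible.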
An Englishman, Charles Babbage, is recognised today as the Father of Computers - he was the first person to think of the concept of a programmable computer. In 1822 he began work on a Difference Engine that could calculate a series of values automatically. Though this was a very large machine, weighing over 13 tonnes and standing over two meters high, it was never completed. A replica of the Difference Engine, built in 1991 by the Science Museum in London, could give results of calculations up to 31 digits - more than a modern pocket calculator. Lady Ada Lovelace, daughter of the famous poet Lord Byron, worked with Babbage and wrote some instructions (programs) for his machines. For this, she is regarded as the first computer programmer - though they didn't know it at the time, this was the first combination of hardware and software.
The first person to make money out of computing was, not surprisingly, an American - Herman Hollerith, who was a statistician with the US Census Bureau. Counting for the 1880 US census (population 50,189,209) was not complete until 1888, and by law a census had to be conducted every ten years. As the US population was experiencing rapid growth, Hollerith estimated that the 1890 census would take more than ten years to complete. In 1884 he invented the Census Counting Machine, which used punched cards to collect the census data; these cards were then fed into a card reader to count and record the results. With the introduction of this technology, the 1890 census (population 62,947,714) took just six weeks to process, even though far more data were collected than ever before. Hollerith set up a company, the Tabulating Machine Company, and many other countries used his technology in their censuses. This included the 1911 census of the UK and Ireland, which was made available on-line by the National Archives of Ireland last year. In 1911, Hollerith's company merged with three other companies to form the Computing Tabulating Recording Corporation, which in 1924 was renamed International Business Machines (IBM) - now the largest computer company in the world.
The real dawn of modern computing dates back to 1944, when the Harvard Mark I became the first fully automatic computer to be completed. The lead engineer behind the computer was the American Howard Aiken, who was inspired by Charles Babbage's designs. This computer used mechanical relays - switches that flipped back and forth to represent mathematical data. It was an enormous computer - 16 meters long and 2.5 meters high, weighing 4.5 tonnes, with over 500 miles of wiring. It could store up to 72 numbers, each 23 decimal digits long, and could add three numbers in a second. It was while working on the next-generation Mark II that a moth was discovered trapped in a relay, causing a malfunction - hence the origin of the term "computer bug". Aiken is reputed to have said in 1947 that "only six electronic digital computers would be required to satisfy the computing needs of the entire United States" - how wrong can you be.
War was to play a major part in the development of technology. During World War II, an Englishman, Alan Turing, devised machines at Bletchley Park, including the Bombe, which could decode encrypted German messages created on the Enigma machine. Bletchley Park was also home to the Colossus, designed by the engineer Tommy Flowers to break the more complex Lorenz cipher - the world's first programmable, digital, electronic computing device, which used vacuum tubes to perform its calculations. Turing is regarded as the Father of Modern Computer Science; he went on to develop a test for artificial intelligence, and was also the first person to write a chess program for a computer. Bletchley Park, which is now a museum that includes a replica of the Colossus, was recently in the news as some American companies, including IBM, donated over €70,000 to help keep the museum open.
In 1946, two Americans, John Eckert and John Mauchly, created the ENIAC for the US Department of Defense, to be used in calculations for artillery fire. Even though this machine weighed over 30 tonnes and had 18,000 vacuum tubes, it had little more than the computing power of a modern calculator. The ENIAC was programmed by rewiring the machine, instruction by instruction, by women programmers who were called "computers". The world's first successful commercially available computer was the Ferranti Mark I, built in the UK in 1951. The first commercial computer in the US was the UNIVAC I, built by Eckert and Mauchly - the first one was delivered to the US Census Bureau in March 1951. The fifth UNIVAC I was used by CBS to predict the result of the 1952 Presidential Election - with a sample of just 1% of voters, it predicted that Eisenhower would beat Stevenson.
By the 1950s, vacuum tube technology was reaching its limits - the tubes were highly inefficient, needed a lot of space, and had to be replaced often. Transistors had already been invented in 1947 by the Americans John Bardeen, Walter Brattain, and William Shockley (who shared a Nobel Prize for their invention in 1956). In 1958, another American, Jack St. Clair Kilby, invented the integrated circuit, a device that allowed the placement of many transistors in a small area - we now know this device as a computer chip. Kilby had to wait until 2000 before he too won a Nobel Prize. The chip revolutionised modern electronics - now computers could be a lot smaller and faster.
The first microprocessor was introduced by Intel in 1971 - until then, computers could only be afforded by large organisations. However, it was not until 1977 that the first popular microcomputer, the Apple II, became available. It was commercially successful, selling over two million units between 1977 and 1993. The 12th of August, 1981, is a hugely significant date in the development of modern information technology. On this date IBM introduced the PC - the Personal Computer. It had an Intel 8088 processor with a speed of 4.77 MHz and up to 640 KB of memory. It had no hard disk and used two floppy disks - one for applications, and the other for data. Many other companies cloned the IBM PC. The launch of the PC was such a momentous event that Time Magazine named the Personal Computer its Machine of the Year for 1982. Sadly, several members of the original IBM PC design team, including its leader Don Estridge, died in the 1985 Dallas air disaster.
Crucial to the success of the personal computer was the operating system used, MS-DOS. In 1980, IBM had approached a then little-known college drop-out, Bill Gates, to provide an operating system for the IBM PC. Gates licensed the operating system to IBM and its clones. Gates and Paul Allen had set up a company in 1975 to develop programmes in the BASIC language - from this modest beginning grew the mighty Microsoft Corporation. Software was now becoming just as important as hardware. WordStar was one of the first word processing applications for the PC and was popular up until the early 1990s. It is no longer developed, though curiously it is now reputedly owned by the Irish e-Learning company Riverdeep.
The 1980s was also a time of rivalry. In 1984 Apple launched the Apple Macintosh computer in direct opposition to the PC, to much fanfare, with a famous ad shown during the 1984 Super Bowl which depicted IBM as Big Brother. The ad, costing $1.6 million, was directed by Ridley Scott. It ran only once, though it is now widely available on YouTube. Apple also developed a rivalry with Microsoft. When Microsoft Windows was introduced in 1985, Apple sued for infringement of copyright of "visual displays" - the lawsuit was finally settled in Microsoft's favour in 1993.
But perhaps the most outstanding release of the 1990s was the launch of the World Wide Web by Tim Berners-Lee in 1991. This was closely followed in 1993 by the release of Mosaic, a web browser which allowed people to access content on the Web. By 1995, companies worldwide had started to create a Web presence, leading to the dot-com boom of 1995-2001. During this time another rivalry involving Microsoft erupted - the so-called "Browser Wars" with Netscape. In 1996, Netscape dominated with 80% of all browsers used. By 2001, Microsoft's Internet Explorer had a 90% market share. Microsoft had won again.
The World Wide Web too has evolved. Up until 2004, the Web was essentially an information-retrieval service for most people, and creating Web content required specialist knowledge. Now anyone can set up a web site, create a blog, and publish videos, photographs, or anything they want on the Web - the so-called "Web 2.0". Social networking, file-sharing, and blogging are now commonplace - much of this was almost impossible for most people to do as little as four years ago.
Today, IT continues to grow rapidly. Based on Moore's Law, processing speed, memory capacity, and even the number of pixels in digital cameras are doubling every two years. The computer this article was written on is 630 times faster than the original IBM PC, and has 3,125 times more memory capacity. If the history of technology tells us anything - we ain't seen nothing yet!
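The doubling arithmetic behind Moore's Law is simple enough to sketch in a few lines of Python (my own illustration, not from the original article; the dates are the IBM PC's 1981 launch and this article's 2008 writing):

```python
# Moore's Law sketch: a quantity that doubles every two years grows by
# a factor of 2 raised to the number of doubling periods elapsed.

def growth_factor(years, doubling_period=2):
    """Overall growth after `years`, doubling every `doubling_period` years."""
    return 2 ** (years / doubling_period)

years = 2008 - 1981  # 27 years from the IBM PC launch to this article
factor = growth_factor(years)
print(f"Predicted growth over {years} years: about {factor:,.0f}x")
# 2**13.5 is roughly 11,585x - well above the observed 630x speed-up,
# showing that clock speed alone grew more slowly than the law's
# transistor-count doubling would suggest.
```

Running the same formula against the article's 3,125x memory figure shows memory capacity also lagged the theoretical curve, though by less than clock speed did.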
Dr. Eugene F.M. O’Loughlin is a Lecturer in Computing at the National College of Ireland.