The History of Computing


[Music] Hi, thanks for tuning into Singularity Prosperity. This video is the first in a multi-part series discussing computing; in this video we'll be discussing the evolution of computing, more specifically the evolution of the technologies that have brought about the modern computing era. The purpose of this video is to help us appreciate how fast technology is evolving and the people who have brought us to this point! Many inventions have taken several centuries to develop into their modern forms, and modern inventions are rarely the product of a single inventor's efforts. The computer is no different: the bits and pieces of the computer, both hardware and software, have come together over many centuries, with many people and groups each adding a small contribution.

We start as early as 3000 BC with the abacus. How is this related to computing, you ask? The abacus was one of the first machines humans ever created for counting and calculating. Fast forward to 1642, and the abacus evolves into the first mechanical adding machine, built by the mathematician and scientist Blaise Pascal. This first mechanical calculator, the Pascaline, is also where we see the first signs of technophobia emerging, with mathematicians fearing the loss of their jobs to progress.

Also in the 1600s, from the 1660s to the early 1700s, we meet Gottfried Leibniz, a pioneer in many fields, most notably known for his contributions to mathematics and considered by many the first computer scientist. Inspired by Pascal, he created his own calculating machine, able to perform all four arithmetic operations. He was also the first to lay down the concepts of binary arithmetic, the way all technology nowadays communicates, and he even envisioned a machine that used binary arithmetic.

From birth we are taught to do arithmetic in base 10, and for most people that's all they're concerned with: the digits 0 to 9. However, there are an infinite number of ways to represent information; for example, octal is base 8, hexadecimal is base 16 and is often used to represent colors, base 256 is used in some encodings, and the list goes on. Binary is base 2, represented by the digits 0 and 1. We'll explore later in this video why binary is essential for modern computing.
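
To make the number-base idea concrete, here is a small illustrative sketch in Python (my addition for this write-up, not something from the video) that prints one value in several bases and shows the general "divide by the base, keep the remainders" conversion:

```python
# The same quantity written in several bases.
value = 2023

print(bin(value))  # base 2  -> 0b11111100111
print(oct(value))  # base 8  -> 0o3747
print(value)       # base 10 -> 2023
print(hex(value))  # base 16 -> 0x7e7

# Generic conversion: repeatedly divide by the base and collect the
# remainders; the remainders, read in reverse, are the digits.
def to_base(n: int, base: int) -> list[int]:
    digits = []
    while n:
        n, r = divmod(n, base)
        digits.append(r)
    return list(reversed(digits)) or [0]

print(to_base(2023, 2))    # [1, 1, 1, 1, 1, 1, 0, 0, 1, 1, 1]
print(to_base(2023, 256))  # [7, 231] -- two "bytes": 7 * 256 + 231 = 2023
```

Whatever the base, it is the same information; binary just happens to be the representation that maps cleanly onto two-state hardware, which is the point the rest of the video keeps returning to.
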
Back on topic: progressing to the 1800s, we are met with Charles Babbage. Babbage is known as the father of the computer for the designs of his mechanical calculating engines. In 1820, Babbage noticed that many computations consisted of operations that were regularly repeated, and theorized that these operations could be done automatically. This led to his first design, the difference engine: it would have a fixed instruction set, be fully automatic through the use of steam power, and print its results into a table. In 1830, Babbage stopped work on his difference engine to pursue his second idea, the analytical engine. Elaborating on the difference engine, this machine would be able to execute operations in non-numeric orders through the addition of conditional control, store memory and read instructions from punched cards, essentially making it a programmable mechanical computer. Unfortunately, due to lack of funding, his designs never became reality, but if they had, they would have sped up the invention of the computer by nearly 100 years.

Also worth mentioning is Ada Lovelace, who worked very closely with Babbage. She is considered the world's first programmer and came up with an algorithm for calculating Bernoulli numbers that was designed to work with Babbage's machine. She also outlined many fundamentals of programming, such as data analysis, looping and memory addressing.

Ten years before the turn of the century, with inspiration from Babbage, American inventor Herman Hollerith designed one of the first successful electromechanical machines, referred to as the census tabulator. This machine would read U.S. census data from punched cards, up to 65 at a time, and tally up the results. Hollerith's tabulator became so successful that he went on to found his own firm to market the device; this company eventually became IBM. To briefly explain how punched cards work: once a card is fed into the machine, electrical connections are attempted across it, and the positions of the holes in the card determine which connections are completed and therefore what the input is. To put data onto a punched card you could use a key punch machine, in effect the first iteration of a keyboard!

The 1800s were a period when the theory of computing began to evolve and machines started to be used for calculations, but the 1900s are where we begin to see the pieces of this nearly 5,000-year puzzle coming together, especially between 1930 and 1950. In 1936, Alan Turing proposed the concept of a universal machine, later dubbed the Turing machine, capable of computing anything that is computable. Up to this point, machines were only able to do the specific tasks their hardware was designed for. The concept of the modern computer is largely based on Turing's ideas.

Also starting in 1936, German engineer Konrad Zuse built the world's first programmable computer. This device read instructions from punched tape and was the first computer to use boolean logic and binary to make decisions; his first machine, the Z1, did this with mechanical logic gates, and his later machines used electromechanical relays. For reference, boolean logic is simply logic that results in either a true or false output, or, in binary terms, a one or a zero; we'll dive deeper into boolean logic later in this video. Zuse would later use punched cards to encode information in binary, essentially making them early data storage and memory devices. With the Z4, begun in 1942, Zuse also went on to deliver the world's first commercial computer. For these reasons, many consider Zuse the inventor of the modern-day computer.
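
Since boolean logic keeps coming up with the machines that follow, here is a minimal sketch of my own (ordinary Python standing in for relays or mechanical gates) showing how two true/false operations are already enough to add a pair of one-bit numbers:

```python
# Boolean logic: every operation takes bits (0/1, i.e. false/true)
# and produces a bit.
def AND(a, b): return a & b
def XOR(a, b): return a ^ b

# A half adder combines two gates to add two one-bit numbers:
# XOR gives the sum bit, AND gives the carry bit.
def half_adder(a, b):
    return XOR(a, b), AND(a, b)  # (sum, carry)

for a in (0, 1):
    for b in (0, 1):
        s, c = half_adder(a, b)
        print(f"{a} + {b} -> carry {c}, sum {s}")
# 1 + 1 -> carry 1, sum 0, i.e. binary 10, which is two
```

Chain enough of these gates together and you can add numbers of any width; whether the gates are built from sliding metal plates, relays or vacuum tubes is an engineering detail, which is why the same logical designs kept reappearing as the hardware improved.
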
In 1937, Howard Aiken, with his colleagues at Harvard and in collaboration with IBM, began work on the Harvard Mark 1 calculating machine, a programmable calculator inspired by Babbage's analytical engine. This machine was composed of nearly one million parts, had over 500 miles of wiring and weighed nearly 5 tons! The Mark 1 had 60 sets of 24 switches for manual data entry and could store 72 numbers, each 23 decimal digits long. It could do 3 additions or subtractions in a second, a multiplication took 6 seconds, a division took 15.3 seconds, and a logarithm or trigonometric function took about a minute. As a funny side note, one of the primary programmers of the Mark 1, Grace Hopper, discovered the first computer "bug": a dead moth blocking one of the reading holes of the machine. Hopper is also credited with popularizing the term debugging!

The vacuum tube era marks the beginning of modern computing: the first technology that was fully digital and, unlike the relays used in previous computers, less power-hungry, faster and more reliable. Begun in 1937 and completed in 1942, the first digital computer was built by John Atanasoff and his graduate student Clifford Berry, and was dubbed the ABC. Unlike previously built computers such as Zuse's, the ABC was purely digital: it used vacuum tubes together with binary math and boolean logic to solve up to 29 simultaneous equations at a time.

In 1943, the Colossus was built to assist in breaking German ciphers; it is not to be confused with Turing's bombe, the electromechanical machine that actually attacked Enigma. The Colossus was fully digital as well, but unlike the ABC it was programmable, making it the first programmable electronic digital computer.

Completed in 1946, the Electronic Numerical Integrator and Computer, aka the ENIAC, is considered the first successful high-speed electronic digital computer. Composed of nearly 18,000 vacuum tubes and large enough to fill an entire room, it was somewhat programmable, but like Aiken's Mark 1 it was a pain to rewire every time the instruction set had to be changed. The ENIAC essentially took the concepts from Atanasoff's ABC and elaborated on them at a much larger scale.

While the ENIAC was under construction, in 1945, mathematician John von Neumann contributed a new understanding of how computers should be organized and built, further elaborating on Turing's theories and bringing clarity to the ideas of computer memory and addressing. He elaborated on conditional branching and subroutines, something Babbage had envisioned for his analytical engine nearly 100 years earlier, and on the idea that the instructions, that is, the program running on a computer, could be stored and modified in the same way as data and encoded in binary. Von Neumann assisted in the design of the ENIAC's successor, the Electronic Discrete Variable Automatic Computer, aka the EDVAC, which was completed in 1950, was one of the first stored-program computers, and could execute over 1,000 instructions per second. He is also credited with being the father of computer virology for his design of a self-reproducing computer program. "And it contains essentially those things which the modern computer has in it, although in somewhat primitive form. This machine has the stored-program concept as its major feature, and that in fact is the thing which makes the modern computer revolution possible!"
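
To make the stored-program idea concrete in modern terms, here is a toy sketch of my own (the instruction names are invented and have nothing to do with the EDVAC's actual order code): instructions and data sit side by side in one memory, and a simple loop fetches and executes them.

```python
# A toy stored-program machine: instructions and data share one memory,
# and the program counter (pc) walks through it.
memory = [
    ("LOAD", 7),     # address 0: copy the value at address 7 into the accumulator
    ("ADD", 8),      # address 1: add the value at address 8 to the accumulator
    ("STORE", 9),    # address 2: write the accumulator to address 9
    ("HALT", None),  # address 3: stop
    0, 0, 0,         # addresses 4-6: unused
    2, 3,            # addresses 7-8: the data
    0,               # address 9: the result will land here
]

acc, pc = 0, 0
while True:
    op, addr = memory[pc]  # fetch and decode the next instruction
    pc += 1
    if op == "LOAD":
        acc = memory[addr]
    elif op == "ADD":
        acc += memory[addr]
    elif op == "STORE":
        memory[addr] = acc
    elif op == "HALT":
        break

print(memory[9])  # 5 -- the stored program added 2 and 3
```

Because the program is itself just data sitting in memory, the machine can load, copy or even rewrite it like any other value, which is exactly the feature the quote above is pointing at.
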
At this point you can see that computing had officially evolved into its own field: from mechanical machines, to electromechanical relays that switched in milliseconds, to digital vacuum tubes that switched in mere microseconds; from binary as a way to encode information on punched cards, to binary used with boolean logic and represented by physical technologies like relays and vacuum tubes, to binary finally being used to store instructions and programs. From the abacus as a way to count, Pascal's mechanical calculator, the theories of Leibniz, Alan Turing and John von Neumann, the vision of Babbage and the intellect of Lovelace, George Boole's contribution of boolean logic, the progression from a programmable calculator to a stored-program, fully digital computer, and countless other inventions, individuals and groups, each step was a further accumulation of knowledge. While the title of inventor of the computer may be given to an individual or group, it was really a joint contribution spanning 5,000 years, and especially the years between 1800 and 1950.

Vacuum tubes were a huge improvement over relays, but they still didn't make economic sense at a large scale. For example, of the ENIAC's 18,000 tubes, roughly 50 would burn out per day, and a round-the-clock team of technicians was needed to replace them. Vacuum tubes were also the reason why computers took up entire rooms, weighed multiple tons and consumed enough energy to power a small town!

In 1947, the first transistor was invented at Bell Labs, and by 1954 the first transistorized digital computer, the TRADIC, had been built. It was composed of around 800 transistors, took up only 0.085 cubic meters compared to the ENIAC's 28, drew only 100 watts of power and could perform a million operations per second.

Also during this era, we begin to see major introductions on both the hardware and the software side of computing. On the hardware side, the first practical random-access memory, the magnetic-core store, was introduced in 1951 by Jay Forrester: the beginnings of what we now know as RAM. The first hard drive was introduced by IBM in 1956; it weighed a ton, could store five megabytes and cost approximately 27,000 dollars per month in today's money.

The software side is where many of the major innovations and breakthroughs began to come, because computer hardware and architecture were becoming more standardized instead of everyone working on different variations of a computing machine. Assembly, introduced around 1949, was among the first programming languages and really started taking off in this era; it is a way to communicate with the machine in pseudo-English instead of machine language, aka binary. The first widely used high-level programming language was Fortran, created by John Backus and his team at IBM beginning in 1954. Assembly is a low-level language and Fortran is a high-level language: in a low-level language, even though you aren't writing instructions in raw machine code, a very deep understanding of the computer's architecture and instructions is still required to get a desired program to run, which means only a limited number of people have the skills, and the process is very error-prone. In the early to mid 50s, translating code into machine code was also still an expensive and time-consuming process. This all changed with Grace Hopper and her development of the first compiler; Hopper, if you remember from earlier, also found the first computer 'bug'. The compiler made programming computers far more affordable and nearly instantaneous, replacing the time-consuming process of writing code in assembly and then manually converting it into machine code. As a side note, Hopper also assisted with the invention of another early programming language, COBOL.
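
To illustrate the gap between low-level and high-level languages, here is a schematic comparison (the assembly mnemonics are generic placeholders rather than any real machine's instruction set, and the high-level line is written in Python purely for illustration):

```python
x, y = 2, 3

# Low-level, assembly-style view of the same computation
# (schematic mnemonics; real instruction sets differ):
#     LOAD  R1, x    ; fetch x from memory into a register
#     LOAD  R2, y    ; fetch y into another register
#     ADD   R1, R2   ; add the two registers
#     STORE R1, z    ; write the result back to memory
#
# High-level view: one line, and a compiler -- Hopper's innovation --
# generates the machine-level steps automatically.
z = x + y
print(z)  # 5
```
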
This era marks the beginning of the modern computing era and is where the exponential trend in computing performance really began. While transistors were a major improvement over vacuum tubes, they still had to be soldered together individually. As a result, the more complex computers became, the more complicated and numerous the connections between transistors grew, increasing the likelihood of faulty wiring. In 1958, this all changed with Jack Kilby of Texas Instruments and his invention of the integrated circuit (with Robert Noyce at Fairchild independently developing a practical silicon version shortly after). The integrated circuit was a way to pack many transistors onto a single chip instead of wiring transistors together individually; packing the transistors together also significantly reduced the power and heat consumption of computers once again and made them significantly more economical to design and buy.

Integrated circuits sparked a hardware revolution, and beyond computers the miniaturization they allowed assisted in the development of various other electronic devices, such as the mouse, invented by Douglas Engelbart in 1964; as a side note, Engelbart also gave an early demonstration of a graphical user interface. Computer speed, performance, memory and storage also began to increase steadily as ICs could pack more transistors into smaller surface areas, as demonstrated by the invention of the floppy disk by IBM in 1971 and of DRAM by Intel around the same time, to list a few. Along with hardware, further advances in software were made as well, with an explosion of programming languages and the introduction of some of the most common languages today: BASIC in 1964 and C in 1972.

As you can see throughout this video, computing since the 1900s has evolved at an increasingly fast rate. This led Gordon Moore, later a co-founder of Intel, to make one of the greatest predictions in human history in 1965: computing power would double roughly every two years at low cost, and computers would eventually be so small that they could be embedded into homes, cars and what he referred to as personal portable communications equipment, aka mobile phones. We now refer to this as Moore's Law. Here are some charts to further illustrate how fast computing was evolving and what Moore based his prediction on. "One of my colleagues called this Moore's Law. Rather than just being something that chronicles the progress of the industry, it kind of became something that drove the progress of the industry. A tremendous amount of engineering and commitment has been required to make that happen, but much to my surprise the industry has been able to keep up with the projection!"

[Music] At this point the video has come to a conclusion; I'd like to thank you for taking the time to watch it. If you enjoyed it, please leave a thumbs up, and if you want me to elaborate on any of the topics discussed or have any topic suggestions, please leave them in the comments below. Consider subscribing to my channel for more content, follow my Medium publication for accompanying blogs and like my Facebook page for more bite-sized chunks of content! This has been Ankur, you've been watching Singularity Prosperity, and I'll see you again soon! [Music]

Comments 59

  • Nicely done!

  • nice pace and excellent video, eagerly waiting for you to make videos on future computing technologies like quantum computing (various methodologies of quantum calculation), DNA computing etc., and also comparing computing speeds and possibly their applications and how they will change our future

  • Pretty good video for an up and coming channel!

  • Thanks, PC Principal.

  • Great video! Glad I stumbled across it!

  • Please… explain how a computer can read a program and show the result for a simple task like an addition command. No one provides this basic important information

  • New favorite channel. Keep em coming !

  • Ok Lovelace didn't do any of that s*** she was a secretary I don't know where you guys get this crap

  • Only mistake I could find after an entire evening watching all your videos, love them by the way! I think you mention that Zuse's first computer used relays. If I remember correctly, Zuse's Z1 machine in 1936 was purely mechanical, no relays or vacuum tubes. Very much like the analytical engine in that respect, but Zuse used binary mechanical logic gates of his own design. The machine ran off an electric motor to act as the clock and give mechanical motion to his gates. It is considered the most complex mechanical device ever constructed, with over a quarter million different moving parts… not to mention the dude built it pretty much by himself, in his parents' apartment's living room, FREAKING BADASS if you ask me! XD

  • Nice, but how do you find those historical videos in decent quality?!

  • Nice film, good work, thank you !

  • Become a YouTube member for many exclusive perks from early previews, bonus content, shoutouts and more! https://www.youtube.com/c/singularityprosperity/join – AND – Join our Discord server for much better community discussions! https://discordapp.com/invite/HFUw8eM

  • Moore's law: what if the industry didn't see it as a prediction, but as a rule? There are competitors, and new chips require high investments… What if the captains of industry together made agreements about growth and risk reduction? That puts Moore's law in a new light… Could it be that deliberately smaller steps than possible have been taken to generate more sales?

  • Too much bass in the sound…

  • Sweet video! Was utterly surprised when I clicked out of fullscreen and saw the number of views, likes and subscribers. Keep it up man, I'll spread the word!

  • Your speech rate is too fast. But otherwise an excellent video! Well done sir!

  • great video! hope you'll get more attention – got my sub already

  • Erutmatic

  • great vid

  • You forgot the Russian inventors of the first computer (as we understand it today) in the city of Omsk, Russia, and the other great Russian engineers who made all this happen. Well, the Anglo-Saxons and Japanese just stole it

  • And as you can read in the comments below, the credit that the authors of this video gave to some of the people they mentioned has been exaggerated.

  • In 1968, Soviet engineer-electromechanic Arsenij Anatolievich Gorohov invented a machine which was called a "Device to task a program to reproduce a contour of a component". That's how it was named in the author's certificate, patent No. 383005, of May 18th, 1968. The inventor himself called it a "programmable device, intellector". The Intellector had a monitor, a stand-alone system block with a hard disk drive, a device for solving autonomous problems and personal dialogue with the computer, a motherboard, memory, a GPU and more, excluding a computer mouse.

  • The inventor never got money from the government. The scheme of this invention was published in 1970 in "The Bulletin of Inventions, Discoveries and Trademarks", making it accessible to everyone. In 1975 in the USA, American businessman Ed Roberts released the "Altair 8800", based on Intel's microprocessor, the Intel 8080, released in 1974. In 1976 Steve Wozniak and Steve Jobs assembled an "Altair 8800" in a garage and called it the "Apple I", priced at $666.66 – the first personal computer, which came without a metal case, monitor or keyboard. Two years later these parts were included and it was called the "Apple II".

  • The first personal computer was invented by an ordinary Russian design engineer, Arsenij Anatolievich Gorohov, at the Omsk NII of Aviation Technologies in 1968.

  • In 1951 the computer "Whirlwind" was the first to output data on a screen. It had 512 bytes of memory. And it was as tall as a two-story building.

  • Ad placement is painful 😖

  • Leibnitz is pronounced LIPE-nitz. The Leib- rhymes with pipe.

  • A 5-ton device to store 5MB for $27K/month? Where do I sign up?!?

  • how does a transistor actually make a calculation?

  • Great Video man

  • 3:57 "capable of computing anything that is computable" nooo, seriously, how wonderful and extraordinary is that!! It's like saying I'm capable of eating anything that is eatable 🤣 .. no but really, nice video, I just laughed a lot at that point

  • thanks a lot

  • If you ever go back in time to 1950, don't tell them you've got a 128 GB smartphone in your pocket, you will be locked up as a crazy person

  • Keep it up, the youngsters of today need to know where the processing power that is in their devices today came from!

  • Keep up the fabulous work, buddy!!

  • hey what the fuck – you didnt mention Robert Noyce and integrated circuits! what the fuck?! Fuck jack-off kilby!

  • I watched this video 3 times, and I am amazed every time. Good choice of music!

  • You left out Shockley.
    Why?

  • 👍👍

  • We owe a lot to Grace Hopper, don't we?

  • Too fast bro ! Also the monotone is a problem too

  • Bro slow down

  • Why did you leave out the Jacquard loom? The Jacquard loom, which used punch cards, inspired Babbage to make his machines.

  • Discover Computing from the guidance of the Quran by Dr Zaid Kasim Ghazzawi .😊

  • Nice video. Had to watch on 0.75 speed 🙂

  • The music fits the topic of this video so well

  • Really great video. You are a good teacher, but your monotone, fast speech makes it hard to follow along and harder to commit to memory. Try using a bit of excitement and inflection in your verbal presentation. Your channel will get TONS of subscribers if you make the presentation easier to listen to. Great, excellent content though.

  • 0.75 speed is better.

  • If the narrator practiced enunciating syllables and speaking with fluency, this would be highly watchable. Very informative nonetheless.

  • This has helped me understand this better thank u

  • For 5000 years people have been working at a future in which internet porn exists. I salute these heroes.

  • Great content and well organised… but I felt you could improve on your delivery of the information. Not being critical here, but it sounded too monotonous, like a male version of Siri that speaks really fast. Other than that you're really good. Your content shows how well you've understood it, and hats off to that. Not many people run through history so well 🙂 Cheers

  • why do you talk a bit like a robot?
    no offense

  • Great channel! Keep up the good work.

  • For those of you thinking the narrator's speaking too fast, you can slow the video down. I find 0.75 helps.

    Other than that and the sort of monotone voice you had with a little too much bass, the content is excellent. Well done!
