Who Invented the First Computer? | Who Is the Father of the Computer? | The World's First Computer | Charles Babbage

History of the First Computer

A computer is a machine that can be programmed to perform sequences of arithmetic or logical operations automatically. Modern computers can follow generalized sets of operations, called programs, which enable them to perform a wide range of tasks. A "complete" computer, including the hardware, the operating system (core software) and the peripheral equipment required for "complete" operation, can be referred to as a computer system. The term may also refer to a group of computers that are connected and work together, such as a computer network or a computer cluster.

Computers were born not for entertainment or email but to solve a serious number-crunching crisis. By 1880, the U.S. population had grown so large that it took more than seven years to tabulate the results of the U.S. Census. Looking for a faster way, the government turned to punch-card-based computers that took up entire rooms. Today, we carry more computing power in our smartphones than was available in those early models. The brief history of computers that follows is a timeline of how computers evolved from their humble beginnings to the machines of today, which surf the Internet, play games and stream multimedia in addition to crunching numbers.

Charles Babbage and mechanical computers 

Before Babbage, the computer was human. That was the name given to people who specialized in performing calculations: they spent their days carrying out arithmetic operations, repeating the process over and over, and recording the results in tables that were compiled into valuable books. These tables simplified the lives of other specialists, whose work drew on them for all sorts of tasks: from artillery officers deciding how to aim a gun, to tax collectors, to scientists predicting the tides or the movements of the stars in the heavens.

Thus, at the end of the 18th century, Napoleon commissioned Gaspard de Prony (22 July 1755 - 29 July 1839) to undertake the revolutionary task of producing the most accurate logarithmic and trigonometric tables ever made (with between 14 and 29 decimal places), to refine and simplify the astronomical calculations of the Paris Observatory and to unify all the measurements made by the French administration. For this huge task, de Prony had a brilliant idea: to divide the most complex mathematics into simple arithmetic operations that could be performed by less qualified human computers. This way of speeding up the work and avoiding mistakes inspired the English polymath Charles Babbage (26 December 1791 - 18 October 1871) to take the next step: replacing human computers with machines.

Babbage is considered by many to be the father of the computer, even though his vision never fully materialized. His first attempt was the Difference Engine, which he began building in 1822 on the principle of finite differences: it avoided multiplication and division by calculating complex mathematical tables through simple series of additions and subtractions. He even built a small calculator that proved his method worked, but he was never able to build a full Difference Engine to fill those logarithmic and trigonometric tables with accurate data. In 1835, Lady Byron, the mother of Ada Lovelace, claimed to have seen a working partial model, limited in both complexity and precision; but by then, Babbage had already exhausted the funding he had obtained from the British government.
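
To see why this trick mattered, here is a minimal Python sketch (an illustration, not Babbage's own notation) of the method of finite differences. Because the n-th differences of a degree-n polynomial are constant, an entire table can be cranked out with nothing but additions, which is exactly the operation the Difference Engine mechanized:

```python
def tabulate(f, start, step, degree, count):
    """Tabulate a degree-n polynomial using only additions,
    the operation Babbage's Difference Engine performed mechanically."""
    # Seed the engine: evaluate the first degree+1 points directly,
    # then take successive differences to fill the initial columns.
    col = [f(start + i * step) for i in range(degree + 1)]
    state = []
    for _ in range(degree + 1):
        state.append(col[0])
        col = [col[i + 1] - col[i] for i in range(len(col) - 1)]
    # Crank the engine: every further entry needs only additions,
    # because the degree-th differences of a polynomial are constant.
    table = []
    for _ in range(count):
        table.append(state[0])
        for k in range(degree):
            state[k] += state[k + 1]   # propagate each difference forward
    return table

# Example: x^2 + x + 41 tabulated at x = 0, 1, 2, ...
poly = lambda x: x * x + x + 41
print(tabulate(poly, start=0, step=1, degree=2, count=8))
# -> [41, 43, 47, 53, 61, 71, 83, 97]
```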

Babbage's Analytical Engine


Rather than being discouraged by the setback, the mathematician, philosopher, engineer and inventor Charles Babbage doubled down. He focused all his energy on developing the Analytical Engine, an even more ambitious machine that would be able to perform more complex calculations, including multiplication and division. Again, Babbage never got beyond the design phase, but the design he began in 1834 makes him, if not the father of the computer, then a prophet of exactly what was to come.

Charles Babbage, the inventor of the first computer

The thousands of pages of notes and diagrams about Babbage's Analytical Engine include elements and processes common to any modern computer: a logical unit for performing arithmetic calculations (the equivalent of a processor or CPU), a control structure with instructions, loops and conditional branching (like a programming language), and data storage on punched cards (an early version of memory), an idea Babbage borrowed from the Jacquard loom. He even considered recording the results of the calculations on paper, using an output device that was a precursor of today's printers.
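
The correspondence with a modern computer is easy to make concrete. Below is a hypothetical toy interpreter in Python; the instruction names and layout are invented for illustration, but it combines the same four ingredients Babbage specified: an arithmetic "mill" (CPU), a program of instructions, conditional branching with a loop, and an addressable "store" (memory):

```python
def run(program, store):
    """Execute a list of instructions against an addressable store."""
    pc = 0                                # control: which "card" is next
    while pc < len(program):
        op, *args = program[pc]
        if op == "ADD":                   # the "mill": arithmetic unit
            a, b, dest = args
            store[dest] = store[a] + store[b]
        elif op == "JUMP_IF_POS":         # conditional branching
            cell, target = args
            if store[cell] > 0:
                pc = target
                continue
        pc += 1
    return store

# Sum the integers 1..5 by looping: cell 0 is the accumulator,
# cell 1 the counter, cell 2 holds the constant -1.
store = {0: 0, 1: 5, 2: -1}
program = [
    ("ADD", 0, 1, 0),        # accumulator += counter
    ("ADD", 1, 2, 1),        # counter -= 1
    ("JUMP_IF_POS", 1, 0),   # loop back while counter > 0
]
print(run(program, store)[0])  # -> 15
```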

Thomson Brothers and analog computers

In 1872, a year after the death of Charles Babbage, the great physicist William Thomson (Lord Kelvin) invented a machine capable of performing complex calculations and predicting the tides in a given place. It is considered the first analog computer, an honour it shares with the differential analyser built by his brother James Thomson in 1876. That later device was a more advanced and complete version, which managed to solve differential equations by integration, using wheel-and-disc mechanisms.


Lord Kelvin’s harmonic analyser, used for mathematical prediction of tides

However, it took several more decades, well into the 20th century, for H.L. Hazen and Vannevar Bush to perfect the idea of the mechanical analog computer at MIT (the Massachusetts Institute of Technology). Between 1928 and 1931, they built a differential analyser that was truly practical, since it could be used to solve a variety of problems; by that criterion, it could be considered the first computer.
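
What those wheel-and-disc integrators did can be sketched numerically. The following Python fragment is an illustrative approximation, not a model of any specific machine: two chained integrators, each accumulating its input over time, solve the harmonic equation y'' = -ω²y, the kind used to describe a single tidal constituent:

```python
def two_integrators(omega, y0, v0, dt, steps):
    """Solve y'' = -omega^2 * y with two chained integrators.
    Each loop iteration plays the role of a wheel-and-disc unit
    accumulating its input over one small interval of time."""
    y, v = y0, v0
    trace = []
    for _ in range(steps):
        a = -omega ** 2 * y   # feedback: acceleration derived from position
        v += a * dt           # first integrator: velocity from acceleration
        y += v * dt           # second integrator: position from velocity
        trace.append(y)
    return trace

# One oscillating "tidal constituent" with period 2*pi: starting at
# y = 1, the trace comes back near 1 after about 628 steps.
tide = two_integrators(omega=1.0, y0=1.0, v0=0.0, dt=0.01, steps=700)
print(f"y after one full period: {tide[627]:.3f}")
```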

Turing and Universal Computing Machine

By this time, these analog machines could already replace human computers in some tasks, and they calculated faster and faster, especially when their gears began to be replaced by electronic components. But they still had a serious shortcoming: they were designed to perform one type of calculation, and if they were to be used for another, their gears or circuits had to be replaced. That was the case until 1936, when a young English student, Alan Turing, conceived of a computer that would solve any problem that could be translated into mathematical terms and then reduced to a chain of logical operations with binary numbers, in which only two decisions could be made: true or false. The idea was to reduce everything (numbers, letters, pictures, sounds) to strings of ones and zeros and to use a recipe (a program) to solve the problems in very simple steps. The digital computer was born, but for now it was only an imaginary machine.
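
Here is a minimal sketch of Turing's idea, with an invented rule-table format for illustration: the machine below reads and writes binary symbols on a tape, and changing the table (the "program") changes the problem it solves, with no change to the machinery itself. This particular table increments a binary number:

```python
def run_tm(rules, tape_str, state="q0", blank="_", max_steps=10_000):
    """Run a transition table over a tape. The hardware never changes;
    only the table (the 'program') decides what problem gets solved."""
    tape = dict(enumerate(tape_str))
    head = 0
    while state != "halt" and max_steps > 0:
        symbol = tape.get(head, blank)
        write, move, state = rules[(state, symbol)]
        tape[head] = write
        head += 1 if move == "R" else -1
        max_steps -= 1
    cells = (tape.get(i, blank) for i in range(min(tape), max(tape) + 1))
    return "".join(cells).strip(blank)

# Binary increment: scan right past the digits, then carry a 1 back
# to the left. (state, read) -> (write, move, next state)
increment = {
    ("q0", "0"): ("0", "R", "q0"),    # skip over the input digits
    ("q0", "1"): ("1", "R", "q0"),
    ("q0", "_"): ("_", "L", "q1"),    # blank found: turn around
    ("q1", "1"): ("0", "L", "q1"),    # 1 + carry = 10: keep carrying
    ("q1", "0"): ("1", "L", "halt"),  # 0 + carry = 1: done
    ("q1", "_"): ("1", "L", "halt"),  # overflow: new leading 1
}

print(run_tm(increment, "1011"))  # 11 + 1 = 12 -> "1100"
```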

Zuse and digital computers

Although Turing established what a computer should theoretically be, he was not the first to put one into practice. That honour goes to an engineer who was slow to gain recognition, since his work was carried out under the Nazi regime during World War II. On 12 May 1941, Konrad Zuse completed the Z3 in Berlin, the first fully functional (programmable and automatic) digital computer. As the pioneers of Silicon Valley would later do, Zuse built the Z3 in his home workshop, without electronic components, using telephone relays instead. The first digital computer was thus electromechanical; an electronic version was never built, because the German government did not consider it "strategically important" during the war.

On the other side of the war, the Allies did see the importance of building electronic computers, using thousands of vacuum tubes. The first was the ABC (Atanasoff-Berry Computer), created in the United States in 1942 by John Vincent Atanasoff and Clifford E. Berry, which could not be programmed and was not "Turing-complete." In Great Britain, meanwhile, two of Alan Turing's colleagues, Tommy Flowers and Max Newman, who worked on breaking the Nazi codes at Bletchley Park, built the Colossus, the first electronic, digital and programmable computer. But Colossus, like the ABC, lacked one final detail: it was not "Turing-complete." The first machine that was, ENIAC, ran for the first time at the University of Pennsylvania in December 1945, to study the feasibility of the hydrogen bomb. For other calculations, its "program" had to be changed, that is, a multitude of cables and switches had to be manually reconnected. ENIAC, designed by John Mauchly and J. Presper Eckert, occupied 167 m², weighed 30 tons, consumed 150 kilowatts of electricity and contained almost 18,000 vacuum tubes.

ENIAC was soon surpassed by other computers that stored their programs in electronic memories. The vacuum tubes were replaced first by transistors and, eventually, by microchips, starting the race towards the miniaturization of computers. But it was that giant machine, the great victor of World War II, that kicked off our digital age. As for Konrad Zuse (1910-1995), in 1961 he rebuilt his Z3, which had been destroyed by wartime bombing; the machine many would consider the first true computer of modern history can be seen today as a replica at the Deutsches Museum in Munich. In 1998, the Mexican computer scientist Raúl Rojas made an in-depth study of the Z3 and succeeded in proving that it could be "Turing-complete," something that not even its creator, by then deceased, had considered.

Zuse’s Z3, the first fully programmable and automatic computer

Focused on that task, Zuse did not realize that he had the first universal computing machine in his hands; in fact, he never thought of his work that way... So, was Charles Babbage, Konrad Zuse or Alan Turing the inventor of the computer? Was the Z3, the Colossus or ENIAC the first modern computer? It depends. The question remains as open today as ever: what makes a machine a computer?
