History of Computer (Complete)

This post is a complete history of computers, from the first generation to the fifth. Writing up the historical development of computer technology was an assignment back when I was still in high school. Hehe. I recently found my old history-of-computers notes again, scattered among the files on my overflowing hard drive, so rather than let them sprawl any further, I figured I might as well post them on this Information Technology blog.

Based on the development of computer technology, its history can be divided into two periods:

a. Before 1940.

b. After 1940.

Before 1940

Since time immemorial, humans have processed data. They also invented mechanical and electronic devices to help with calculation and data processing, and to get results faster.




The computers we encounter today are the result of a long evolution of human inventions, from ancient times onward, in the form of mechanical and electronic devices.

Today, computers and their supporting devices are part of every aspect of life and work, and they are capable of far more than ordinary mathematical calculations. Examples include supermarket checkout systems that read the codes on groceries, telephone exchanges that handle millions of calls, and the computer networks and Internet that connect places all over the world. All data processing tools, from ancient times until now, can be classified into four major categories:


1. Manual equipment: very simple data processing tools, operated chiefly by human muscle power.
2. Mechanical equipment: mechanical devices driven by hand.
3. Electro-mechanical equipment: mechanical devices driven automatically by an electric motor.
4. Electronic equipment: devices that work fully electronically.
Several devices were used for calculation before the invention of the computer:
1. Abacus
The abacus appeared about 5,000 years ago in Asia Minor and is still used in some places today; it can be regarded as the beginning of the computing machine. It lets its user perform calculations by sliding beads arranged on rods.
Merchants of the time used the abacus to calculate trade transactions. With the spread of pencil and paper, particularly in Europe, the abacus lost its importance.
2. Numerical wheel calculator
Almost twelve centuries later came another invention in computing machines. In 1642, Blaise Pascal (1623-1662), then 18 years old, built what he called a numerical wheel calculator to help his father with tax calculations.
This rectangular brass box, called the Pascaline, used eight serrated wheels to add numbers of up to eight digits. The device counted in base ten. Its weakness was that it was limited to addition.
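The wheel-and-carry mechanism described above can be sketched in a few lines of Python. This is only a toy model for illustration; the real Pascaline used gears and a mechanical carry device, not software:

```python
def pascaline_add(a, b, wheels=8):
    """Add two non-negative integers digit by digit, one decimal wheel at a time."""
    digits_a = [(a // 10**i) % 10 for i in range(wheels)]
    digits_b = [(b // 10**i) % 10 for i in range(wheels)]
    result, carry = [], 0
    for da, db in zip(digits_a, digits_b):
        total = da + db + carry
        result.append(total % 10)  # the digit this wheel now shows
        carry = total // 10        # a full turn past 9 pushes the next wheel
    # a carry out of the eighth wheel is simply lost (overflow)
    return sum(d * 10**i for i, d in enumerate(result))

print(pascaline_add(1234, 5678))  # 6912
```

Note how the only primitive operation here is the carry-propagating add, which is why the machine was limited to addition.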
3. Numerical wheel calculator 2
In 1694, the German mathematician and philosopher Gottfried Wilhelm von Leibniz (1646-1716) improved on the Pascaline by creating a machine that could also multiply. Like its predecessor, this mechanical device worked with serrated wheels.
By studying the notes and drawings Pascal had made, Leibniz was able to refine his instrument.
4. Mechanical Calculator.
Charles Xavier Thomas de Colmar invented a machine that could perform the four basic arithmetic functions. Colmar's mechanical calculator, the arithometer, offered a more practical approach to calculation because it could add, subtract, multiply, and divide.
Thanks to this capability, the arithometer was widely used up to the time of World War I. Together with Pascal and Leibniz, Colmar helped build the era of mechanical computing.
The real beginnings of the computer were laid by a British mathematics professor, Charles Babbage (1791-1871). In 1812, Babbage noticed a natural fit between mechanical machinery and mathematics: mechanical machines are very good at performing the same task repeatedly without mistakes, while mathematics often requires the simple repetition of certain steps.
The problem then grew into putting mechanical machines to work as tools for answering mechanical needs. Babbage's first attempt to address it appeared in 1822, when he proposed a machine for calculating differential equations.
The machine was called the Difference Engine. Powered by steam, it could store a program and could perform calculations and print the results automatically.
After working on the Difference Engine for ten years, Babbage was inspired to start building the first general-purpose computer, which he called the Analytical Engine. Babbage's assistant, Augusta Ada King (1815-1852), played an important role in the design of this machine.
She helped revise the plans, sought funding from the British government, and communicated the Analytical Engine's specifications to the public. Augusta's deep understanding of the machine also allowed her to put instructions into it,
which makes her the first female programmer. In 1980, the U.S. Defense Department named a programming language ADA in her honor.
In 1889, Herman Hollerith (1860-1929) also applied punched cards to perform calculations. His first task was to find a faster way to carry out the count for the United States Census Bureau.
The previous census, conducted in 1880, had taken seven years to tabulate. With the growing population, the Bureau estimated that the next census count would take ten years to complete.
In the following period, several engineers made other new inventions. Vannevar Bush (1890-1974) built a calculator for solving differential equations in 1931.
The machine could solve complex differential equations that academics had long considered intractable, but it was very large and heavy because of the hundreds of gears and shafts needed to perform the calculations. In the late 1930s, John V. Atanasoff and Clifford Berry tried to build an electronic computer by applying Boolean algebra to electrical circuits.
This approach was based on the work of George Boole (1815-1864) on a binary system of algebra, which holds that any logical statement can be expressed as true or false. By mapping true and false onto electrical circuits as connected and disconnected, Atanasoff and Berry built
the first electronic computer in 1940, but the project stalled when it lost its funding.
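Boole's idea maps directly onto code: treat a connected circuit as True and a disconnected one as False, then build arithmetic out of logic gates. A minimal sketch follows; the gate names are the modern standard ones, not taken from the Atanasoff-Berry machine:

```python
def AND(a, b): return a and b   # both circuits connected
def OR(a, b):  return a or b    # at least one connected
def XOR(a, b): return a != b    # exactly one connected

def half_adder(a, b):
    """Add two one-bit values using only gates; returns (sum_bit, carry_bit)."""
    return XOR(a, b), AND(a, b)

print(half_adder(True, True))   # (False, True), i.e. binary 10 = 2
```

Chaining such adders bit by bit is, in essence, how electronic computers do arithmetic.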
After 1940
The development of computers after 1940 is divided into five generations.
1. First-generation computers (1940-1959)
First-generation computers used vacuum tubes to process and store data. The tubes heated up quickly and burned out easily, and thousands of them were needed to run the computer's overall operation.
They also consumed so much electrical energy that they caused electrical disturbances in the surrounding area.
First-generation computers were 100% electronic and helped experts solve calculation problems quickly and accurately. Some first-generation computers:
a. ENIAC (Electronic Numerical Integrator and Calculator),
designed by Dr. John Mauchly and J. Presper Eckert in 1946.
COMPUTER ENIAC
Computers of this generation began to store data, following the stored-program concept proposed by John von Neumann.
b. EDVAC Computer.
COMPUTER EDVAC
The EDVAC (Electronic Discrete Variable Automatic Computer) design reduced the number of vacuum tubes, making its calculations faster than the ENIAC's.
c. EDSAC Computer.
EDSAC (Electronic Delay Storage Automatic Calculator) introduced the use of mercury-filled tubes (delay lines) for storing data.
COMPUTER EDSAC
d. UNIVAC 1 Computer.
In 1951, Dr. Mauchly and Eckert created the UNIVAC 1 (Universal Automatic Computer), the first computer used to process commercial data.
2. Second-generation computers (1959-1964)
In 1948, the invention of the transistor greatly influenced the development of computers. Transistors replaced vacuum tubes in televisions, radios, and computers, and as a result the size of electrical machinery shrank drastically. Transistors began to be used in computers in 1956.
Another advance was magnetic-core memory, which helped make second-generation computers smaller, faster, more reliable, and more energy-efficient than their predecessors. The first machines to use these new technologies were supercomputers.
IBM built a supercomputer named Stretch, and Sperry-Rand built one named LARC. These computers, developed for atomic energy laboratories, could handle large amounts of data, but they were very expensive and tended to be too complex for business computing needs, which limited their appeal.
Only two LARCs were ever installed and used: one at the Lawrence Radiation Labs in Livermore, California, and the other at the U.S. Navy Research and Development Center in Washington, D.C. Second-generation computers replaced machine language with assembly language.
Assembly language uses abbreviations in place of binary code. In the early 1960s, commercially successful second-generation computers began to appear in businesses, universities, and government.
These machines were fully transistorized, and they had the components we associate with computers today: printers, disk storage, memory, operating systems, and programs.
DEC PDP-8 COMPUTER
One important example of a computer from this period was the IBM 1401, which was widely accepted in industry. By 1965, almost all large businesses used second-generation computers to process financial information.
The program stored in the computer, and the programming language that came with it, gave the computer flexibility, and this flexibility boosted performance at a reasonable price for business use.
With this concept, a computer could print customer purchase invoices and then run a product design or calculate a payroll.
Several programming languages appeared at that time. Common Business-Oriented Language (COBOL) and Formula Translator (FORTRAN) came into common use.
These languages replaced cryptic binary machine code with words, sentences, and mathematical formulas that humans could understand more easily,
making it possible for a person to program and configure a computer. Various new careers emerged (programmer, analyst, and computer systems expert), and the software industry also began to develop during this generation.
3. Third-generation computers (1964 - early 1980s)
Although transistors surpassed vacuum tubes in many respects, they generated substantial heat, which could damage a computer's internal parts. Quartz rock eliminated this problem.


Jack Kilby, an engineer at Texas Instruments, developed the integrated circuit (IC) in 1958. The IC combined three electronic components on a small silicon disc made from quartz sand. Scientists later managed to fit more and more components onto a single chip, called a semiconductor.
As a result, computers became ever smaller as more components were squeezed onto each chip. Another third-generation advance was the operating system, which allowed a machine to run many different programs at once, with a central program monitoring and coordinating the computer's memory.
4. Fourth-generation computers (early 1980s - ?)
After the IC, the goal became shrinking circuits and electrical components. Large Scale Integration (LSI) could fit hundreds of components on a chip. By the 1980s, Very Large Scale Integration (VLSI) put thousands of components on a single chip, and Ultra-Large Scale Integration (ULSI) increased that number into the millions.
The ability to fit so many components onto a chip half the size of a coin drove down the price and size of computers, while increasing their power, efficiency, and reliability.
The Intel 4004 chip, made in 1971, advanced the IC by putting all the components of a computer (central processing unit, memory, and input/output control) on one very small chip. Previously, ICs were made to do one specific task.
Now a microprocessor could be manufactured and then programmed to meet any requirement. Soon, everyday household devices such as microwave ovens, televisions, and cars with electronic fuel injection incorporated microprocessors.
These developments allowed ordinary people to use computers, which were no longer the preserve of big companies or government agencies. In the mid-1970s, computer assemblers began offering their machines to the general public. These computers, called minicomputers, were sold with software packages easy enough for laypeople to use.
The most popular software of the time was word processing and spreadsheet programs. In the early 1980s, video games such as the Atari 2600 sparked consumer interest in more sophisticated, programmable home computers.
In 1981, IBM introduced the Personal Computer (PC) for use in homes, offices, and schools. The number of PCs in use jumped from 2 million units in 1981 to 5.5 million in 1982.
Ten years later, 65 million PCs were in use. Computers continued to evolve toward smaller sizes, from the desktop computer to the computer that fits in a bag (the laptop), and even the computer that can be held in one hand (the palmtop).

 
Traditional computer hardware architecture consists of four main components: the processor, memory storage, input, and output. This traditional model is often known as the von Neumann architecture.
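The four components can be illustrated with a tiny fetch-execute loop in Python. The miniature instruction set (LOAD/ADD/PRINT/HALT) is invented here purely for illustration; it does not belong to any historical machine:

```python
def run(memory):
    """Walk a program stored in memory, von Neumann style."""
    pc, acc, output = 0, 0, []       # program counter and accumulator (Processor)
    while True:
        op, arg = memory[pc]         # fetch the next instruction (Memory)
        pc += 1
        if op == "LOAD":
            acc = arg                # bring a value in (Input)
        elif op == "ADD":
            acc += arg
        elif op == "PRINT":
            output.append(acc)       # send a result out (Output)
        elif op == "HALT":
            return output

program = [("LOAD", 2), ("ADD", 3), ("PRINT", None), ("HALT", None)]
print(run(program))  # [5]
```

The key von Neumann idea is visible here: the program and the data it works on live in the same memory.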
In the beginning, computers were so large that their components filled a very large room. The user, a programmer who also doubled as the computer's operator, worked inside the computer room itself.
Although enormous, such a system was in its way a "personal computer" (PC): anyone who wanted to compute had to book a place in a queue to get an allocation of time (typically 30-120 minutes).
To compile a Fortran program, the user first loaded the Fortran compiler, then loaded the program and its data. The result usually came out as a printout. This PC-style system had several problems.
For instance, time had to be allocated in advance. If a job finished ahead of plan, the computer sat idle and unused.
Conversely, if a job ran later than planned, the next would-be user had to wait until it was done. Moreover, a Fortran user was fortunate only if the previous user had also used Fortran;
if the previous user had run Cobol, the Fortran compiler had to be loaded all over again. This problem was solved by grouping users of the same compiler into the same batch. The punched-card medium was later replaced with tape.


Picture 1.2. The von Neumann computer architecture


Later, the duties of programmers and operators were separated. The operators were usually the exclusive residents of the "glass room" next to the computer room. The programmers, as users, accessed the computer indirectly through the operators.

A user prepared a job consisting of the application program, the input data, and some control commands.

The usual medium was punched cards, each holding one line of up to 80 characters. A complete set of job cards was then handed over to the operator.


Picture 1.3. Diagram of a Personal Computer



The development of operating systems begins here, with the batch system (Figure 1.4, "Chart Memory For Simple Batch System Monitor"). The operator collected similar jobs, which were then executed as a group.

For instance, jobs requiring the Fortran compiler were collected into one batch along with other jobs that also required the Fortran compiler. When one job in the group finished, the next job ran automatically.
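The operator's batching strategy can be sketched as follows (the job and compiler names are hypothetical). The point is that each compiler gets loaded once per batch instead of once per job:

```python
from itertools import groupby

def run_in_batches(jobs):
    """jobs: list of (name, compiler). Returns the operator's run log."""
    log = []
    # groupby needs its input sorted on the same key
    for compiler, batch in groupby(sorted(jobs, key=lambda j: j[1]),
                                   key=lambda j: j[1]):
        log.append(f"load {compiler} compiler")          # once per batch
        log.extend(f"run {name}" for name, _ in batch)   # every job in it
    return log

jobs = [("payroll", "Fortran"), ("census", "Cobol"),
        ("orbit", "Fortran"), ("billing", "Cobol")]
for line in run_in_batches(jobs):
    print(line)
```

With four jobs across two compilers, the compiler is loaded twice rather than four times.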


Picture 1.4. Chart Memory For Simple Batch System Monitor



The next development introduced the concept of the multiprogrammed system. In this system, several jobs are kept in main memory at the same time and take turns using the CPU.
This requires some additional capabilities:
I/O routines provided by the system, memory management to allocate memory among the jobs, CPU scheduling to choose which job to run next, and allocation of the other hardware
(Figure 1.4, "Chart Memory For Simple Batch System Monitor").
The next improvement is known as time sharing, multitasking, or interactive computing (Time-Sharing System / Multitasking / Interactive Computing). Such a system can be accessed by more than one user simultaneously. The CPU is shared among the jobs in memory and on disk.
The CPU is allocated only to jobs in memory, and jobs are swapped between memory and disk. Direct interaction between user and computer gave birth to a new concern: keeping response times reasonable, so that users do not wait too long.
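The time-sharing idea can be sketched as a round-robin scheduler, one common policy for sharing the CPU among jobs; the job names and quantum size here are made up:

```python
from collections import deque

def time_share(jobs, quantum=2):
    """jobs: dict of name -> remaining time units. Returns the finish order."""
    queue, finished = deque(jobs.items()), []
    while queue:
        name, remaining = queue.popleft()
        remaining -= quantum                  # the job gets one quantum of CPU
        if remaining > 0:
            queue.append((name, remaining))   # unfinished: back of the line
        else:
            finished.append(name)
    return finished

print(time_share({"edit": 3, "compile": 6, "print": 2}))  # ['print', 'edit', 'compile']
```

Because every job gets a turn each cycle, short interactive jobs finish quickly instead of waiting behind long ones.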
Until the late 1980s, computer systems of "ordinary" capability were commonly known as mainframes, while systems of much lower capacity (and price) were called minicomputers.
Conversely, computers with far more sophisticated capabilities were called supercomputers. The CDC 6600, from the 1960s, is known as the first supercomputer. But the working principles of the operating system are more or less the same on all of these machines.
The classical computers described above have only one processor.
The advantages of such a system are that it is easier to implement, since there is no inter-processor synchronization to worry about; the processor is easier to control, because the protection system is not overly complicated; and it tends to be cheaper (more economical).
Note that "single processor" here means a single processor serving as the Central Processing Unit (CPU).
That concludes the history of computers from the first generation to the fifth. The point above is emphasized because some devices, such as AGP VGA cards and optical mice, do have separate processors of their own.


source : http://informasi-teknologi.com/sejarah-perkembangan-komputer.html

