
NAME     : HARIHARAN T
ROLL NO  : UCS 11209
SUBJECT  : COMPUTER SCIENCE
SEMESTER :
TOPIC    : Evolution of computer & Components of computer


Abacus

The abacus, also called a counting frame, is a calculating tool used primarily in parts of Asia for performing arithmetic processes. Today, abaci are often constructed as a bamboo frame with beads sliding on wires, but originally they were beans or stones moved in grooves in sand or on tablets of wood, stone, or metal. The abacus was in use centuries before the adoption of the written modern numeral system and is still widely used by merchants, traders and clerks in Asia, Africa, and elsewhere. The user of an abacus is called an abacist.

Indian abacus
First-century sources, such as the Abhidharmakosa, describe the knowledge and use of the abacus in India. Around the 5th century, Indian clerks were already finding new ways of recording the contents of the abacus. Hindu texts used the term shunya (zero) to indicate the empty column on the abacus.

Mesopotamian abacus
The period 2700–2300 BC saw the first appearance of the Sumerian abacus, a table of successive columns which delimited the successive orders of magnitude of their sexagesimal number system. Some scholars point to a character from the Babylonian cuneiform which may have been derived from a representation of the abacus. It is the belief of Carruccio (and other Old Babylonian scholars) that Old Babylonians "may have used the abacus for the operations of addition and subtraction; however, this primitive device proved difficult to use for more complex calculations".


Napier's bones

Napier's bones is an abacus created by John Napier for calculation of products and quotients of numbers. It was based on Arab mathematics and the lattice multiplication used by Matrakci Nasuh in the Umdet-ul Hisab and by Fibonacci in the Liber Abaci. The method is also called rabdology (from Greek ῥάβδος [rhabdos], "rod", and λογία [logia], "study"). Napier published his version of the rods in a work printed in Edinburgh, Scotland, at the end of 1617, entitled Rabdologiæ. Using the multiplication tables embedded in the rods, multiplication can be reduced to addition operations and division to subtractions. More advanced use of the rods can even extract square roots. Note that Napier's bones are not the same as logarithms, with which Napier's name is also associated. The abacus consists of a board with a rim; the user places Napier's rods in the rim to conduct multiplication or division. The board's left edge is divided into 9 squares, holding the numbers 1 to 9. The Napier's rods consist of strips of wood, metal or heavy cardboard. Napier's bones are three-dimensional, square in cross section, with four different rods engraved on each one. A set of such bones might be enclosed in a convenient carrying case. A rod's surface comprises 9 squares, and each square, except for the top one, comprises two halves divided by a diagonal line. The first square of each rod holds a single digit, and the other squares hold this number's double, triple, quadruple, quintuple, and so on until the last square contains nine times the number in the top square. The digits of each product are written one to each side of the diagonal; numbers less than 10 occupy the lower triangle, with a zero in the top half. A set consists of 10 rods corresponding to digits 0 to 9. The rod for 0, although it may look unnecessary, is still needed for multipliers or multiplicands having 0 in them.
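To make the rod procedure concrete, here is a minimal Python sketch of the idea for a single-digit multiplier (the function name and the decimal-string handling are illustrative, not part of the historical apparatus). Each rod contributes a pre-tabulated single-digit product, and the answer falls out of positional addition alone; multi-digit multipliers repeat this per digit with a shift, which is exactly how the rods reduce multiplication to addition.

    def napier_multiply(n: int, multiplier_digit: int) -> int:
        """Multiply n by a single digit the way the rods do: read off each
        rod's pre-tabulated square for the multiplier row, then add along
        the diagonals (done here as ordinary positional addition)."""
        total, place = 0, 1
        for digit in reversed(str(n)):              # rods laid side by side = digits of n
            square = int(digit) * multiplier_digit  # the product printed on that rod
            total += square * place                 # diagonal addition, positionally
            place *= 10
        return total

    # 425 x 6: the rods for 4, 2, 5 show 24, 12, 30 in row 6;
    # adding along the diagonals gives 2550.
    assert napier_multiply(425, 6) == 2550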

Slide rule

The slide rule, also known colloquially as a slipstick, is a mechanical analog computer. The slide rule is used primarily for multiplication and division, and also for functions such as roots, logarithms and trigonometry, but is not normally used for addition or subtraction.

Slide rules come in a diverse range of styles and generally appear in a linear or circular form with a standardized set of markings (scales) essential to performing mathematical computations. Slide rules manufactured for specialized fields such as aviation or finance typically feature additional scales that aid in calculations common to that field.

William Oughtred and others developed the slide rule in the 17th century based on the emerging work on logarithms by John Napier. Before the advent of the pocket calculator, it was the most commonly used calculation tool in science and engineering. The use of slide rules continued to grow through the 1950s and 1960s even as digital computing devices were being gradually introduced; but around 1974 the electronic scientific calculator made it largely obsolete and most suppliers left the business.
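The trick behind the scales can be stated in a few lines. The sketch below is a simplified model rather than a description of any particular rule: it treats position along a C/D scale as the base-10 logarithm, so sliding one scale against another adds logarithms and therefore multiplies numbers.

    import math

    def scale_position(x: float) -> float:
        """Distance of x along a logarithmic C/D scale covering 1..10."""
        return math.log10(x)

    def slide_rule_multiply(a: float, b: float) -> float:
        # Align the slide's index with a, read the result under b:
        # geometrically this is log(a) + log(b) = log(a * b).
        return 10 ** (scale_position(a) + scale_position(b))

    print(slide_rule_multiply(2.0, 3.0))  # ~6.0, to slide-rule precision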


Pascal's calculator

Blaise Pascal invented the mechanical calculator in 1642. He conceived it while trying to help his father, who had been assigned the task of reorganizing the tax revenues of the French province of Haute-Normandie. First called the Arithmetic Machine, then Pascal's Calculator and later the Pascaline, it could add and subtract directly and multiply and divide by repetition. Pascal went through 50 prototypes before presenting his first machine to the public in 1645. He dedicated it to Pierre Séguier, the chancellor of France at the time. He built around twenty more machines during the next decade, often improving on his original design. Nine machines have survived the centuries, most of them on display in European museums. In 1649 a royal privilege, signed by Louis XIV of France, gave him exclusivity over the design and manufacture of calculating machines in France. Its introduction launched the development of mechanical calculators, first in Europe and then all over the world, a development which culminated, three centuries later, in the invention of the microprocessor, developed for a Busicom calculator in 1971. The mechanical calculator industry owes a lot of its key machines and inventions to the Pascaline. First, Gottfried Leibniz invented his Leibniz wheels after 1671 while trying to add an automatic multiplication and division feature to the Pascaline; then Thomas de Colmar drew his inspiration from Pascal and Leibniz when he designed his arithmometer in 1820; and finally Dorr E. Felt substituted columns of keys for the input wheels of the Pascaline to invent his comptometer around 1887. The Pascaline was also constantly improved upon, especially with the machines of Dr. Roth around 1840, and then with some portable machines until the creation of the first electronic calculators.


Gottfried Leibniz

Specifications

Can multiply, divide, add and subtract. Mechanical device made of copper and steel. Carrying is performed with a stepped wheel, a mechanism still in use today.

Chronology

Contrary to Pascal, Leibniz (1646-1716) successfully introduced a calculator onto the market. It was designed in 1673 but took until 1694 to complete. The calculator could add, subtract, multiply, and divide. Wheels placed at right angles could be displaced by a special stepping mechanism. The speed of calculation for multiplication or division was acceptable. But like the Pascaline, this calculator required that the operator understand how to turn the wheels and know the way of performing calculations with the calculator.


Punched card

A punched card, punch card, IBM card, or Hollerith card is a piece of stiff paper that contains digital information represented by the presence or absence of holes in predefined positions. Now an obsolete recording medium, punched cards were widely used throughout the 19th century for controlling textile looms and in the late 19th and early 20th century for operating fairground organs and related instruments. They were used through the 20th century in unit record machines for input, processing, and data storage. Early digital computers used punched cards, often prepared using keypunch machines, as the primary medium for input of both computer programs and data. Some voting machines use punched cards. The early applications of punched cards all used specifically designed card layouts. It wasn't until around 1928 that punched cards and machines were made "general purpose". The rectangular, round, or oval bits of paper punched out are called chad (recently, chads) or chips (in IBM usage). Multi-character data, such as words or large numbers, were stored in adjacent card columns known as fields. A group of cards is called a deck. One upper corner of each card was usually cut so that cards not oriented correctly, or cards with different corner cuts, could be easily identified. Cards were commonly printed so that the row and column position of a punch could be identified. For some applications printing might have included fields, named and marked by vertical lines, logos, and more.
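As an illustration of "data as holes", here is a toy Python model of a card as a grid of booleans, using the 12-row-by-20-column dimensions of the early Hollerith cards described later in this report. The one-hole-per-column digit encoding is purely illustrative, not an actual historical card code.

    ROWS, COLS = 12, 20

    def blank_card():
        return [[False] * COLS for _ in range(ROWS)]

    def punch_digit(card, col, digit):
        """Record a decimal digit by punching row `digit` of one column."""
        card[digit][col] = True

    def read_field(card, first_col, width):
        """A 'field' is a run of adjacent columns; read it back as a number."""
        digits = []
        for col in range(first_col, first_col + width):
            digits.append(next(r for r in range(ROWS) if card[r][col]))
        return int("".join(map(str, digits)))

    card = blank_card()
    for i, d in enumerate([1, 8, 9, 0]):   # punch the year 1890 into columns 0-3
        punch_digit(card, i, d)
    print(read_field(card, 0, 4))          # -> 1890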


Analytical Engine

Babbage's first attempt at a mechanical computing device, the difference engine, was a special-purpose calculator designed to tabulate logarithms and trigonometric functions by evaluating finite differences to create approximating polynomials. Construction of this machine was never completed; Babbage had conflicts with his chief engineer, Joseph Clement, and ultimately the British government withdrew its funding for the project. During this project he realized that a much more general design, the Analytical Engine, was possible. The input (programs and data) was to be provided to the machine via punched cards, a method being used at the time to direct mechanical looms such as the Jacquard loom. For output, the machine would have a printer, a curve plotter and a bell. The machine would also be able to punch numbers onto cards to be read in later. It employed ordinary base-10 fixed-point arithmetic. There was to be a store (that is, a memory) capable of holding 1,000 numbers of 50 decimal digits each (ca. 20.7 kB). An arithmetical unit (the "mill") would be able to perform all four arithmetic operations, plus comparisons and optionally square roots. Initially it was conceived as a difference engine curved back upon itself, in a generally circular layout, with the long store exiting off to one side. Like the central processing unit (CPU) in a modern computer, the mill would rely upon its own internal procedures, to be stored in the form of pegs inserted into rotating drums called "barrels", to carry out some of the more complex instructions the user's program might specify. The programming language to be employed by users was akin to modern-day assembly languages. Loops and conditional branching were possible, and so the language as conceived would have been Turing-complete long before Alan Turing's concept. Three different types of punch cards were used: one for arithmetical operations, one for numerical constants, and one for load and store operations, transferring numbers from the store to the arithmetical unit or back. There were three separate readers for the three types of cards. In 1842, the Italian mathematician Luigi Menabrea, whom Babbage had met while travelling in Italy, wrote a description of the engine in French. In 1843, the description was translated into English and extensively annotated by Ada Byron, Countess of Lovelace, who had become interested in the engine ten years earlier. In recognition of her additions to Menabrea's paper, which included a way to calculate Bernoulli numbers using the machine, she has been described as the first computer programmer. The modern computer programming language Ada is named in her honour.
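The "ca. 20.7 kB" figure for the store can be checked with a line of arithmetic, assuming each decimal digit carries log2(10) ≈ 3.32 bits of information:

    import math

    digits_per_number, numbers = 50, 1000
    bits_per_decimal_digit = math.log2(10)   # ~3.32 bits to encode one decimal digit
    total_bytes = numbers * digits_per_number * bits_per_decimal_digit / 8
    print(f"{total_bytes / 1000:.1f} kB")    # -> 20.8 kB, in line with "ca. 20.7 kB"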


Herman Hollerith

1888 Competition
Following the 1880 census, the Census Bureau was collecting more data than it could tabulate. As a result, the agency held a competition in 1888 to find a more efficient method to process and tabulate data. Contestants were asked to process 1880 census data from four areas in St. Louis, MO. Whoever captured and processed the data fastest would win a contract for the 1890 census. Three contestants accepted the Census Bureau's challenge. The first two contestants captured the data in 144.5 hours and 100.5 hours. The third contestant, a former Census Bureau employee named Herman Hollerith, completed the data capture process in 72.5 hours. Next, the contestants had to prove that their designs could prepare data for tabulation (i.e., by age category, race, gender, etc.). Two contestants required 44.5 hours and 55.5 hours. Hollerith astounded Census Bureau officials by completing the task in just 5.5 hours! Herman Hollerith's impressive results earned him the contract to process and tabulate 1890 census data. Modified versions of his technology would continue to be used at the Census Bureau until replaced by computers in the 1950s.

Components of the Hollerith Tabulator


Herman Hollerith's tabulator consisted of electrically-operated components that captured and processed census data by "reading" holes on paper punch cards.

Pantograph
To begin tabulating data, census information had to be transferred from the census schedules to paper punch cards using a pantograph. The punch cards measured 3.25 by 7.375 inches and contained 12 rows of 20 columns. (Cards used in later censuses had additional columns to collect more data.) Each position in a row and column corresponded to a specific data entry on the census schedule. Census Bureau clerks using pantographs could prepare approximately 500 cards per day. To operate the mechanism, the operator positioned the punching stylus over the desired hole in a punch card template. Each hole in the template corresponded to a specific demographic category. Pressing the stylus into the template created a punched hole in the paper card that was read by the Hollerith tabulator's card reader.

Card Reader
Each Hollerith tabulator was equipped with a card reading station. The manually operated card reader consisted of two hinged plates operated by a lever (similar to a waffle iron). Clerks opened the reader and positioned a punched card between the plates. Upon closing the plates, spring-loaded metal pins in the upper plate passed through the punched data holes in the cards, through the bottom plate, and into wells of mercury beneath. Pins that passed through the punch card completed an electrical circuit when contacting the mercury below. The completed circuit energized the magnetic dials on the Hollerith tabulator and advanced the counting hands. Upon completion of the electrical circuit (signaled by the ringing of a bell), the clerk transcribed the data indicated by the dial hands.

Hollerith Tabulator Dials


The 1890 Hollerith tabulators consisted of 40 data-recording dials. Each dial represented a different data item collected during the census. The electrical impulses received as the reader's pins passed through the card into the mercury advanced the hands on the dials corresponding to the data contained on the punch card (i.e., responses to inquiries about race, gender, citizenship, age, etc.). When the bell signalled the card had been read, the operator recorded the data on the dials, opened the card reader, removed the punch cards, and reset the dials.
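The read-and-count cycle above can be summarized in a short software analogy. The hole positions and category names below are invented stand-ins for the 1890 wiring, not the real census code; the point is only that each hole closes a circuit that advances one counter.

    from collections import Counter

    # Which dial is wired to which hole position (row, column) -- illustrative.
    DIAL_FOR_HOLE = {(0, 3): "male", (1, 3): "female", (0, 7): "citizen"}

    def tabulate(cards):
        dials = Counter()
        for card in cards:                  # clerk presses the reader closed
            for hole in card:               # each pin that reaches the mercury...
                if hole in DIAL_FOR_HOLE:
                    dials[DIAL_FOR_HOLE[hole]] += 1  # ...advances one dial hand
        return dials

    cards = [{(0, 3), (0, 7)}, {(1, 3)}]    # two punched cards, as sets of holes
    print(tabulate(cards))                  # Counter({'male': 1, 'citizen': 1, 'female': 1})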

Sorting Table
A sorting table was positioned next to each tabulator. After the punch card data had been registered on the dials, the sorter indicated the drawer in which the operator should place the card. The clerk opened the reader, placed the punch card in the designated sorter drawer, reset the dials, and positioned a new card to repeat the process. An experienced tabulator clerk could process 80 punch cards per minute.


Harvard Mark I

The IBM Automatic Sequence Controlled Calculator (ASCC), called the Mark I by Harvard University, was an electromechanical computer. The ASCC was devised by Howard H. Aiken, built at IBM and shipped to Harvard in February 1944. It began computations for the U.S. Navy Bureau of Ships in May and was officially presented to the university on August 7, 1944. It was very reliable, much more so than early electronic computers. It has been described as "the beginning of the era of the modern computer" and "the real dawn of the computer age". The ASCC was built from switches, relays, rotating shafts, and clutches. It used 765,000 components and hundreds of miles of wire, occupying a volume 51 feet (16 m) in length, eight feet (2.4 m) in height, and two feet (~61 cm) in depth. It weighed about 10,000 pounds (4,500 kg). The basic calculating units had to be synchronized mechanically, so they were run by a 50-foot (~15.5 m) shaft driven by a five-horsepower (4 kW) electric motor. From the IBM Archives: The Automatic Sequence Controlled Calculator (Harvard Mark I) was the first operating machine that could execute long computations automatically. A project conceived by Harvard University's Dr. Howard Aiken, the Mark I was built by IBM engineers in Endicott, N.Y. A steel frame 51 feet (16 m) long and eight feet high held the calculator, which consisted of an interlocking panel of small gears, counters, switches and control circuits, all only a few inches in depth. The ASCC used 500 miles (800 km) of wire with three million connections, 3,500 multipole relays with 35,000 contacts, 2,225 counters, 1,464 ten-pole switches and tiers of 72 adding machines, each with 23 significant numbers. It was the industry's largest electromechanical calculator. The enclosure for the Mark I was designed by futuristic American industrial designer Norman Bel Geddes. Aiken considered the elaborate case to be a waste of resources, since computing power was in high demand during the war and the funds ($50,000 or more according to Grace Hopper) could have been used to build additional computer equipment.


Atanasoff-Berry Computer (ABC)

The Atanasoff-Berry Computer (ABC) was the first electronic digital computing device. Conceived in 1937, the machine was not programmable, being designed only to solve systems of linear equations. It was successfully tested in 1942. However, its intermediate result storage mechanism, a paper card writer/reader, was unreliable, and when inventor John Vincent Atanasoff left Iowa State College for World War II assignments, work on the machine was discontinued. The ABC pioneered important elements of modern computing, including binary arithmetic and electronic switching elements, but its special-purpose nature and lack of a changeable, stored program distinguish it from modern computers. The computer was designated an IEEE Milestone in 1990. Atanasoff and Clifford Berry's computer work was not widely known until it was rediscovered in the 1960s, amidst conflicting claims about the first instance of an electronic computer. At that time, the ENIAC was considered to be the first computer in the modern sense, but in 1973 a U.S. District Court invalidated the ENIAC patent and concluded that the ENIAC inventors had derived the subject matter of the electronic digital computer from Atanasoff (see Patent dispute). The machine was, however, the first to implement three critical ideas that are still part of every modern computer:
1. Using binary digits to represent all numbers and data
2. Performing all calculations using electronics rather than wheels, ratchets, or mechanical switches
3. Organizing a system in which computation and memory are separated.


ENIAC
(Electronic Numerical Integrator and Computer)

ENIAC (Electronic Numerical Integrator and Computer) was the first general-purpose electronic computer. It was a Turing-complete digital computer capable of being reprogrammed to solve a full range of computing problems. ENIAC was designed to calculate artillery firing tables for the United States Army's Ballistic Research Laboratory. When ENIAC was announced in 1946 it was heralded in the press as a "Giant Brain". It boasted speeds one thousand times faster than electro-mechanical machines, a leap in computing power that no single machine has since matched. This mathematical power, coupled with general-purpose programmability, excited scientists and industrialists. The inventors promoted the spread of these new ideas by teaching a series of lectures on computer architecture. The ENIAC's design and construction was financed by the United States Army during World War II. The construction contract was signed on June 5, 1943, and work on the computer began in secret by the University of Pennsylvania's Moore School of Electrical Engineering starting the following month under the code name "Project PX". The completed machine was announced to the public the evening of February 14, 1946 and formally dedicated the next day at the University of Pennsylvania, having cost almost $500,000 (nearly $6 million in 2010, adjusted for inflation). It was formally accepted by the U.S. Army Ordnance Corps in July 1946. ENIAC was shut down on November 9, 1946 for a refurbishment and a memory upgrade, and was transferred to Aberdeen Proving Ground, Maryland in 1947. There, on July 29, 1947, it was turned on and was in continuous operation until 11:45 p.m. on October 2, 1955. ENIAC was conceived and designed by John Mauchly and J. Presper Eckert of the University of Pennsylvania. The team of design engineers assisting the development included Robert F. Shaw (function tables), Chuan Chu (divider/square-rooter), Thomas Kite Sharpless (master programmer), Arthur Burks (multiplier), Harry Huskey (reader/printer) and Jack Davis (accumulators). ENIAC was named an IEEE Milestone in 1987.

EDVAC
(Electronic Discrete Variable Automatic Computer)

EDVAC (Electronic Discrete Variable Automatic Computer) was one of the earliest electronic computers. Unlike its predecessor the ENIAC, it was binary rather than decimal, and was a stored-program machine.

Project origin and plan


ENIAC inventors John Mauchly and J. Presper Eckert proposed the EDVAC's construction in August 1944, and design work for the EDVAC commenced before the ENIAC was fully operational. The design would implement a number of important architectural and logical improvements conceived during the ENIAC's construction and would incorporate a high-speed serial-access memory. Like the ENIAC, the EDVAC was built for the U.S. Army's Ballistic Research Laboratory at the Aberdeen Proving Ground by the University of Pennsylvania's Moore School of Electrical Engineering. Eckert and Mauchly and the other ENIAC designers were joined by John von Neumann in a consulting role; von Neumann summarized and elaborated upon logical design developments in his 1945 First Draft of a Report on the EDVAC. A contract to build the new computer was signed in April 1946 with an initial budget of US$100,000. The contract named the device the Electronic Discrete Variable Automatic Calculator. The final cost of EDVAC, however, was similar to the ENIAC's, at just under $500,000.


EDSAC
(Electronic Delay Storage Automatic Calculator)

Electronic Delay Storage Automatic Calculator (EDSAC) was an early British computer. The machine, having been inspired by John von Neumann's seminal First Draft of a Report on the EDVAC, was constructed by Maurice Wilkes and his team at the University of Cambridge Mathematical Laboratory in England. EDSAC was the first practical stored-program electronic computer. Later the project was supported by J. Lyons & Co. Ltd., a British firm, who were rewarded with the first commercially applied computer, LEO I, based on the EDSAC design. EDSAC ran its first programs on 6 May 1949, when it calculated a table of squares and a list of prime numbers.

Physical components
As soon as EDSAC was completed, it began serving the University's research needs. None of its components were experimental. It used mercury delay lines for memory, and derated vacuum tubes for logic. Input was via 5-hole punched tape and output was via a teleprinter. Initially registers were limited to an accumulator and a multiplier register. In 1953, David Wheeler, returning from a stay at the University of Illinois, designed an index register as an extension to the original EDSAC hardware.


Manchester Mark 1

The Manchester Mark 1 was one of the earliest stored-program computers, developed at the Victoria University of Manchester from the Small-Scale Experimental Machine (SSEM) or "Baby" (operational in June 1948). It was also called the Manchester Automatic Digital Machine, or MADM. Work began in August 1948, and the first version was operational by April 1949; a program written to search for Mersenne primes ran error-free for nine hours on the night of 16/17 June 1949. The machine's successful operation was widely reported in the British press, which used the phrase "electronic brain" in describing it to their readers. That description provoked a reaction from the head of the University of Manchester's Department of Neurosurgery, the start of a long-running debate as to whether an electronic computer could ever be truly creative. The Mark 1 was initially developed to provide a computing resource within the university, to allow researchers to gain experience in the practical use of computers, but it very quickly also became a prototype on which the design of Ferranti's commercial version could be based. Development ceased at the end of 1949, and the machine was scrapped towards the end of 1950, replaced in February 1951 by a Ferranti Mark 1, the world's first commercially available general-purpose electronic computer. The computer is especially historically significant because of its pioneering inclusion of index registers, an innovation which made it easier for a program to read sequentially through an array of words in memory. Thirty-four patents resulted from the machine's development, and many of the ideas behind its design were incorporated in subsequent commercial products such as the IBM 701 and 702 as well as the Ferranti Mark 1. The chief designers, Frederic C. Williams and Tom Kilburn, concluded from their experiences with the Mark 1 that computers would be used more in scientific roles than in pure mathematics. In 1951 they started development work on Meg, the Mark 1's successor, which would include a floating point unit.

UNIVAC I
(Universal Automatic Computer)

The UNIVAC I (UNIVersal Automatic Computer I) was the first commercial computer produced in the United States. It was designed principally by J. Presper Eckert and John Mauchly, the inventors of the ENIAC. Design work was begun by their company, Eckert-Mauchly Computer Corporation, and was completed after the company had been acquired by Remington Rand. (In the years before successor models of the UNIVAC I appeared, the machine was simply known as "the UNIVAC".) The first UNIVAC was delivered to the United States Census Bureau on March 31, 1951, and was dedicated on June 14 that year.[1] The fifth machine (built for the U.S. Atomic Energy Commission) was used by CBS to predict the result of the 1952 presidential election. With a sample of just 1% of the voting population it correctly predicted that Dwight Eisenhower would win. The UNIVAC I computers were built by Remington Rand's UNIVAC division (successor of the Eckert-Mauchly Computer Corporation, bought by Rand in 1950, which later became part of Sperry, now Unisys). As well as being the first American commercial computer, the UNIVAC I was the first American computer designed at the outset for business and administrative use (i.e., for the fast execution of large numbers of relatively simple arithmetic and data transport operations, as opposed to the complex numerical calculations required by scientific computers). As such the UNIVAC competed directly against punch-card machines (mainly made by IBM), but oddly enough the UNIVAC originally had no means of either reading or punching cards (which initially hindered sales to some companies with large quantities of data on cards, due to potential manual conversion costs). This was corrected by adding offline card processing equipment, the UNIVAC Card to Tape converter and the UNIVAC Tape to Card converter, to transfer data between cards and UNIVAC magnetic tapes. However, the early market share of the UNIVAC I was lower than the Remington Rand Company wished. In an effort to increase market share, the company joined with CBS to have UNIVAC I predict the result of the 1952 presidential election. UNIVAC I predicted that Eisenhower would have a landslide victory over Adlai Stevenson, whom the pollsters favored. The result for UNIVAC I was greater public awareness of computing technology.

Microprocessor

A microprocessor incorporates the functions of a computer's central processing unit (CPU) on a single integrated circuit, or at most a few integrated circuits. It is a multipurpose, programmable device that accepts digital data as input, processes it according to instructions stored in its memory, and provides results as output. It is an example of sequential digital logic, as it has internal memory. Microprocessors operate on numbers and symbols represented in the binary numeral system. The advent of low-cost computers on integrated circuits has transformed modern society. General-purpose microprocessors in personal computers are used for computation, text editing, multimedia display, and communication over the Internet. Many more microprocessors are part of embedded systems, providing digital control of a myriad of objects from appliances to automobiles to cellular phones and industrial process control. During the 1960s, computer processors were constructed out of small- and medium-scale ICs, each containing from tens to a few hundred transistors. For each computer built, all of these had to be placed and soldered onto printed circuit boards, and often multiple boards would have to be interconnected in a chassis. The large number of discrete logic gates used more electrical power and therefore produced more heat than a more integrated design with fewer ICs. The distance that signals had to travel between ICs on the boards limited the speed at which a computer could operate. The integration of a whole CPU onto a single chip or on a few chips greatly reduced the cost of processing power. The integrated circuit processor was produced in large numbers by highly automated processes, so unit cost was low. Single-chip processors also increased reliability, as there were many fewer electrical connections to fail. As microprocessor designs get faster, the cost of manufacturing a chip (with smaller components built on a semiconductor chip the same size) generally stays the same. Microprocessors integrated, into one or a few large-scale ICs, the architectures that had previously been implemented using many medium- and small-scale integrated circuits. Continued increases in microprocessor capacity have rendered other forms of computers almost completely obsolete (see history of computing hardware), with one or more microprocessors used in everything from the smallest embedded systems and handheld devices to the largest mainframes and supercomputers. The first microprocessors emerged in the early 1970s and were used for electronic calculators, using binary-coded decimal (BCD) arithmetic on 4-bit words. Other embedded uses of 4-bit and 8-bit microprocessors, such as terminals, printers, various kinds of automation etc., followed soon after. Affordable 8-bit microprocessors with 16-bit addressing also led to the first general-purpose microcomputers from the mid-1970s on. Since the early 1970s, the increase in capacity of microprocessors has followed Moore's law, which suggests that the number of transistors that can be fitted onto a chip doubles every two years. Although originally calculated as a doubling every year, Moore later refined the period to two years.[4] It is often incorrectly quoted as a doubling of transistors every 18 months.
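The doubling rule above is easy to state as a formula: a count that doubles every two years grows as N(t) = N0 * 2^(t/2). Here is a small sketch using the 4004's roughly 2,300 transistors as the starting point; the projection horizon is illustrative, and real chips only track the trend to within an order of magnitude.

    def transistors(initial: int, years: float, doubling_period: float = 2.0) -> float:
        """Project a transistor count that doubles every `doubling_period` years."""
        return initial * 2 ** (years / doubling_period)

    # From ~2,300 transistors in 1971, a two-year doubling predicts on the
    # order of 50 million by 2000 -- roughly right for CPUs of that era.
    print(f"{transistors(2300, 2000 - 1971):,.0f}")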

Three projects delivered a microprocessor at about the same time: Intel's 4004, Texas Instruments (TI) TMS 1000, and Garrett AiResearch's Central Air Data Computer (CADC).


Personal computer

A personal computer (PC) is any general-purpose computer whose size, capabilities, and original sales price make it useful for individuals, and which is intended to be operated directly by an end-user with no intervening computer operator. In contrast, the batch processing or time-sharing models allowed large expensive mainframe systems to be used by many people, usually at the same time. Large data processing systems require a full-time staff to operate efficiently. Software applications for personal computers include, but are not limited to, word processing, spreadsheets, databases, Web browsers and email clients, digital media playback, games, and myriad personal productivity and special-purpose software applications. Modern personal computers often have connections to the Internet, allowing access to the World Wide Web and a wide range of other resources. Personal computers may be connected to a local area network (LAN), either by a cable or a wireless connection. A personal computer may be a desktop computer or a laptop, tablet PC, or a handheld PC. While early PC owners usually had to write their own programs to do anything useful with the machines, today's users have access to a wide range of commercial software and free software, which is provided in ready-to-run or ready-to-compile form. Since the 1980s, Microsoft and Intel have dominated much of the personal computer market, first with MS-DOS and then with the Wintel platform. Alternatives to Windows include Apple's Mac OS X and the open-source Linux OSes. AMD is the major alternative to Intel. Applications and games for PCs are typically developed and distributed independently from the hardware or OS manufacturers, whereas software for many mobile phones and other portable systems is approved and distributed through a centralized online store. In July and August 2011, marketing businesses and journalists started to talk about the 'Post-PC Era', an era in which the desktop form factor was being replaced by more portable computing devices such as netbooks and tablet PCs.

PowerPC 600

The PowerPC 600 family was the first family of PowerPC processors built. They were designed at the Somerset facility in Austin, Texas, jointly funded and staffed by engineers from IBM and Motorola as a part of the AIM alliance. Somerset was opened in 1992 and its goal was to make the first PowerPC processor and then keep designing general-purpose PowerPC processors for personal computers. The first incarnation became the PowerPC 601 in 1993, and the second generation soon followed with the PowerPC 603, PowerPC 604 and the 64-bit PowerPC 620. The chip was designed to suit a wide variety of applications and had support for external L2 cache and symmetric multiprocessing. It had four functional units, including a floating point unit, an integer unit, a branch unit and a sequencer unit. The processor also included a memory management unit. The integer pipeline was four stages long, the branch pipeline two stages long, the memory pipeline five stages long, and the floating-point pipeline six stages long. First launched in IBM systems in the fall of 1993, it was marketed by IBM as the PPC601 and by Motorola as the MPC601. It operated at speeds ranging from 50 to 80 MHz. It was fabricated using a 0.6 µm CMOS process with four levels of aluminum interconnect. The die was 121 mm² in area and contained 2.8 million transistors. The 601 has a 32 kB unified L1 cache, a capacity that was considered large at the time for an on-chip cache. Thanks partly to the large cache it was considered a high-performance processor in its segment, outperforming the competing Intel Pentium. The PowerPC 601 was used in the first Power Macintosh computers from Apple, and in a variety of RS/6000 workstations and SMP servers from IBM and Groupe Bull. IBM was the sole manufacturer of the 601 and 601+ microprocessors in its Burlington, Vermont and East Fishkill, New York production facilities. The 601 used the IBM CMOS-4s process and the 601+ used the IBM CMOS-5x process. An extremely small number of these 601 and 601+ processors were relabeled with Motorola logos and part numbers and distributed through Motorola. These facts are somewhat obscured given that there are various pictures of the "Motorola MPC601", particularly one specific case of masterful Motorola marketing where the 601 was named one of Time Magazine's 1994 "Products of the Year" with a Motorola marking.


Glossary
The first counting device was the abacus, originally from Asia. It worked on a place-value notion, meaning that the place of a bead or rock on the apparatus determined how much it was worth.

1600s: John Napier discovers logarithms. Robert Bissaker invents the slide rule, which will remain in popular use until 19??.
1642: Blaise Pascal, a French mathematician and philosopher, invents the first mechanical digital calculator using gears, called the Pascaline. Although this machine could perform addition and subtraction on whole numbers, it was too expensive and only Pascal himself could repair it.
1804: Joseph Marie Jacquard used punch cards to automate a weaving loom.
1812: Charles P. Babbage, the "father of the computer", discovered that many long calculations involved many similar, repeated operations. Therefore, he designed a machine, the difference engine, which would be steam-powered, fully automatic and commanded by a fixed instruction program. In 1833, Babbage quit working on this machine to concentrate on the analytical engine.
1840s: Augusta Ada, "the first programmer", suggested that a binary system should be used for storage rather than a decimal system.
1850s: George Boole developed Boolean logic, which would later be used in the design of computer circuitry.
1890: Dr. Herman Hollerith introduced the first electromechanical, punched-card data-processing machine, which was used to compile information for the 1890 U.S. census. Hollerith's tabulator became so successful that he started his own business to market it. His company would eventually become International Business Machines (IBM).
1906: The vacuum tube is invented by American physicist Lee De Forest.
1939: Dr. John V. Atanasoff and his assistant Clifford Berry build the first electronic digital computer. Their machine, the Atanasoff-Berry Computer (ABC), provided the foundation for the advances in electronic digital computers.
1941: Konrad Zuse (deceased in January of 1996), from Germany, introduced the first programmable computer designed to solve complex engineering equations. This machine, called the Z3, was also the first to work on the binary system instead of the decimal system.
1943: British mathematician Alan Turing developed a hypothetical device, the Turing machine, which would be designed to perform logical operations and could read and write. It would presage programmable computers. He also used vacuum tube technology to build the British Colossus, a machine used to counteract the German code scrambling device, Enigma.
1944: Howard Aiken, in collaboration with engineers from IBM, constructed a large automatic digital sequence-controlled computer called the Harvard Mark I. This computer could handle all four arithmetic operations, and had special built-in programs for logarithms and trigonometric functions.
1945: Dr. John von Neumann presented a paper outlining the stored-program concept.
1946: The giant ENIAC (Electronic Numerical Integrator and Computer) machine was developed by John W. Mauchly and J. Presper Eckert, Jr. at the University of Pennsylvania. It used 18,000 vacuum tubes, took punch-card input, weighed thirty tons and occupied a thirty-by-fifty-foot space. It was not a stored-program machine but was productive from 1946 to 1955 and was used to compute artillery firing tables. In 1947, the transistor was invented by William Shockley, John Bardeen and Walter Brattain of Bell Labs. It would replace vacuum tubes in computers and radios.
1949: Maurice V. Wilkes built the EDSAC (Electronic Delay Storage Automatic Calculator), the first stored-program computer to run. EDVAC (Electronic Discrete Variable Automatic Computer), the second stored-program computer, was built by Mauchly, Eckert, and von Neumann. An Wang developed magnetic-core memory, which Jay Forrester would reorganize to be more efficient.
1950: The Pilot ACE, based on Turing's design, was completed; it is considered by some to be the first programmable digital computer.


GENERATION OF COMPUTER
FIRST GENERATION: Machines

Even before the ENIAC was finished, Eckert and Mauchly recognized its limitations and started the design of a stored-program computer, EDVAC. John von Neumann was credited with a widely circulated report describing the EDVAC design, in which both the programs and working data were stored in a single, unified store. This basic design, denoted the von Neumann architecture, would serve as the foundation for the worldwide development of ENIAC's successors. In this generation of equipment, temporary or working storage was provided by acoustic delay lines, which used the propagation time of sound through a medium such as liquid mercury (or through a wire) to briefly store data. A series of acoustic pulses is sent along a tube; after a time, as the pulse reaches the end of the tube, the circuitry detects whether the pulse represents a 1 or 0 and causes the oscillator to re-send the pulse. Others used Williams tubes, which use the ability of a small cathode-ray tube (CRT) to store and retrieve data as charged areas on the phosphor screen. By 1954, magnetic core memory was rapidly displacing most other forms of temporary storage, and dominated the field through the mid-1970s. EDVAC was the first stored-program computer designed; however it was not the first to run. Eckert and Mauchly left the project and its construction floundered. The first working von Neumann machine was the Manchester "Baby" or Small-Scale Experimental Machine, developed by Frederic C. Williams and Tom Kilburn at the University of Manchester in 1948 as a test bed for the Williams tube; it was followed in 1949 by the Manchester Mark 1 computer, a complete system using Williams tube and magnetic drum memory, and introducing index registers. The other contender for the title "first digital stored-program computer" had been EDSAC, designed and constructed at the University of Cambridge. Operational less than one year after the Manchester "Baby", it was also capable of tackling real problems. EDSAC was actually inspired by plans for EDVAC (Electronic Discrete Variable Automatic Computer), the successor to ENIAC; these plans were already in place by the time ENIAC was successfully operational. Unlike ENIAC, which used parallel processing, EDVAC used a single processing unit. This design was simpler and was the first to be implemented in each succeeding wave of miniaturization, and increased reliability. Some view Manchester Mark 1 / EDSAC / EDVAC as the "Eves" from which nearly all current computers derive their architecture. Manchester University's machine became the prototype for the Ferranti Mark 1. The first Ferranti Mark 1 machine was delivered to the University in February 1951 and at least nine others were sold between 1951 and 1957. The first universal programmable computer in the Soviet Union was created by a team of scientists under the direction of Sergei Alekseyevich Lebedev from the Kiev Institute of Electrotechnology, Soviet Union (now Ukraine). The computer MESM (МЭСМ, Small Electronic Calculating Machine) became operational in 1950. It had about 6,000 vacuum tubes and consumed 25 kW of power. It could perform approximately 3,000 operations per second. Another early machine was CSIRAC, an Australian design that ran its first test program in 1949. CSIRAC is the oldest computer still in existence and the first to have been used to play digital music.
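The recirculating behaviour of the delay lines described above maps naturally onto a queue: bits "travel" down the tube and are re-injected at the far end, so reading a word means waiting for it to come around. A minimal Python sketch follows; the class and the four-bit word are invented for illustration.

    from collections import deque

    class DelayLine:
        def __init__(self, bits):
            self.line = deque(bits)          # pulses currently in transit

        def tick(self) -> int:
            bit = self.line.popleft()        # pulse arrives at the receiving end
            self.line.append(bit)            # ...and is re-amplified and re-sent
            return bit

    line = DelayLine([1, 0, 1, 1])
    print([line.tick() for _ in range(8)])   # [1, 0, 1, 1, 1, 0, 1, 1] -- data persists only by recirculating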


SECOND GENERATION: Transistors

The bipolar transistor was invented in 1947. From 1955 onwards transistors replaced vacuum tubes in computer designs, giving rise to the "second generation" of computers. Initially the only devices available were germanium point-contact transistors, which, although less reliable than the vacuum tubes they replaced, had the advantage of consuming far less power. The first transistorised computer was built at the University of Manchester and was operational by 1953; a second version was completed there in April 1955. The later machine used 200 transistors and 1,300 solid-state diodes and had a power consumption of 150 watts. However, it still required valves to generate the clock waveforms at 125 kHz and to read and write on the magnetic drum memory, whereas the Harwell CADET operated without any valves by using a lower clock frequency of 58 kHz when it became operational in February 1955. Problems with the reliability of early batches of point-contact and alloyed-junction transistors meant that the machine's mean time between failures was about 90 minutes, but this improved once the more reliable bipolar junction transistors became available. Compared to vacuum tubes, transistors have many advantages: they are smaller and require less power than vacuum tubes, so give off less heat. Silicon junction transistors were much more reliable than vacuum tubes and had a longer, indefinite, service life. Transistorized computers could contain tens of thousands of binary logic circuits in a relatively compact space. Transistors greatly reduced computers' size, initial cost, and operating cost. Typically, second-generation computers were composed of large numbers of printed circuit boards, such as the IBM Standard Modular System, each carrying one to four logic gates or flip-flops. A second-generation computer, the IBM 1401, captured about one third of the world market. IBM installed more than ten thousand 1401s between 1960 and 1964. Transistorized electronics improved not only the CPU (Central Processing Unit), but also the peripheral devices. The IBM 350 RAMAC was introduced in 1956 and was the world's first disk drive. Second-generation disk data storage units were able to store tens of millions of letters and digits. Next to the fixed disk storage units, connected to the CPU via high-speed data transmission, were removable disk data storage units. A removable disk stack could be easily exchanged with another stack in a few seconds. Even if the removable disks' capacity was smaller than that of fixed disks, their interchangeability guaranteed a nearly unlimited quantity of data close at hand. Magnetic tape provided archival capability for this data, at a lower cost than disk. Many second-generation CPUs delegated peripheral device communications to a secondary processor. For example, while the communication processor controlled card reading and punching, the main CPU executed calculations and binary branch instructions. One databus would bear data between the main CPU and core memory at the CPU's fetch-execute cycle rate, and other databusses would typically serve the peripheral devices. On the PDP-1, the core memory's cycle time was 5 microseconds; consequently most arithmetic instructions took 10 microseconds (100,000 operations per second) because most operations took at least two memory cycles: one for the instruction, one for the operand data fetch. During the second generation, remote terminal units (often in the form of teletype machines like a Friden Flexowriter) saw greatly increased use. Telephone connections provided sufficient speed for early remote terminals and allowed hundreds of kilometers of separation between remote terminals and the computing center. Eventually these stand-alone computer networks would be generalized into an interconnected network of networks: the Internet.

THIRD GENERATION

The mass increase in the use of computers accelerated with 'Third Generation' computers. These generally relied on Jack Kilby's invention of the integrated circuit (or microchip), starting around 1965. However, the IBM System/360 used hybrid circuits, which were solid-state devices interconnected on a substrate with discrete wires. The first integrated circuit was produced in September 1958, but computers using them didn't begin to appear until 1963. Some of their early uses were in embedded systems, notably used by NASA for the Apollo Guidance Computer, by the military in the LGM-30 Minuteman intercontinental ballistic missile, and in the Central Air Data Computer used for flight control in the US Navy's F-14A Tomcat fighter jet. By 1971, the Illiac IV supercomputer, which was the fastest computer in the world for several years, used about a quarter-million small-scale ECL logic gate integrated circuits to make up sixty-four parallel data processors. While large mainframe computers such as the System/360 increased storage and processing abilities, the integrated circuit also allowed the development of much smaller computers. The minicomputer was a significant innovation in the 1960s and 1970s. It brought computing power to more people, not only through more convenient physical size but also through broadening the computer vendor field. Digital Equipment Corporation became the number two computer company behind IBM with their popular PDP and VAX computer systems. Smaller, affordable hardware also brought about the development of important new operating systems like Unix. In 1966, Hewlett-Packard entered the general-purpose computer business with its HP-2116, offering computing power formerly found only in much larger computers. It supported a wide variety of languages, among them BASIC, ALGOL, and FORTRAN. In 1969, Data General shipped a total of 50,000 Novas at $8,000 each. The Nova was one of the first 16-bit minicomputers and led the way toward word lengths that were multiples of the 8-bit byte. It was first to employ medium-scale integration (MSI) circuits from Fairchild Semiconductor, with subsequent models using large-scale integration (LSI) circuits. Also notable was that the entire central processor was contained on one 15-inch printed circuit board. In 1973, the TV Typewriter, designed by Don Lancaster, provided electronics hobbyists with a display of alphanumeric information on an ordinary television set. It used $120 worth of electronics components, as outlined in the September 1973 issue of Radio-Electronics magazine. The original design included two memory boards and could generate and store 512 characters as 16 lines of 32 characters. A 90-minute cassette tape provided supplementary storage for about 100 pages of text. His design used minimalistic hardware to generate the timing of the various signals needed to create the TV signal. Clive Sinclair later used the same approach in his legendary Sinclair ZX80.


FOURTH GENERATION
The basis of the fourth generation was the invention of the microprocessor by a team at Intel. Unlike third-generation minicomputers, which were essentially scaled-down versions of mainframe computers, the fourth generation's origins are fundamentally different. Microprocessor-based computers were originally very limited in their computational ability and speed, and were in no way an attempt to downsize the minicomputer. They were addressing an entirely different market. Although processing power and storage capacities have grown beyond all recognition since the 1970s, the underlying technology of large-scale integration (LSI) or very-large-scale integration (VLSI) microchips has remained basically the same, so it is widely regarded that most of today's computers still belong to the fourth generation.

Microprocessors

On November 15, 1971, Intel released the world's first commercial microprocessor, the 4004. It was developed for a Japanese calculator company, Busicom, as an alternative to hardwired circuitry, but computers were developed around it, with much of their processing abilities provided by one small microprocessor chip. Coupled with one of Intel's other products, the RAM chip (kilobits of memory on one chip, based on an invention by Robert Dennard of IBM), the microprocessor allowed fourth-generation computers to be smaller and faster than prior computers. The 4004 was only capable of 60,000 instructions per second, but its successors, the Intel 8008, 8080 (used in many computers using the CP/M operating system), and the 8086/8088 family (the IBM personal computer (PC) and compatibles use processors still backwards-compatible with the 8086), brought ever-growing speed and power to the computers. Other producers also made microprocessors which were widely used in microcomputers.

Supercomputers

At the other end of the computing spectrum from the microcomputers, the powerful supercomputers of the era also used integrated circuit technology. In 1976 the Cray-1 was developed by Seymour Cray, who had left Control Data in 1972 to form his own company. This machine, the first supercomputer to make vector processing practical, had a characteristic horseshoe shape, to speed processing by shortening circuit paths. Vector processing, which uses one instruction to perform the same operation on many arguments, has been a fundamental supercomputer processing method ever since. The Cray-1 could calculate 150 million floating point operations per second (150 megaflops). 85 were shipped at a price of $5 million each. The Cray-1 had a CPU that was mostly constructed of SSI and MSI ECL ICs.
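The contrast between scalar and vector styles can be shown in a few lines. In this sketch, NumPy arrays stand in for the Cray-1's 64-element vector registers; the data values are arbitrary, and the point is only that one vector operation replaces 64 scalar ones.

    import numpy as np

    a = np.arange(64, dtype=np.float64)        # a "vector register" of 64 elements,
    b = np.full(64, 2.0)                       # the Cray-1's actual vector length

    scalar = [a[i] * b[i] for i in range(64)]  # scalar style: 64 separate multiplies
    vector = a * b                             # vector style: one vector multiply

    assert np.allclose(scalar, vector)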


Mainframes and Minicomputers

Before the introduction of the microprocessor in the early 1970s, computers were generally large, costly systems owned by large institutions: corporations, universities, government agencies, and the like. Users, who were experienced specialists, did not usually interact with the machine itself, but instead prepared tasks for the computer on off-line equipment, such as card punches. A number of assignments for the computer would be gathered up and processed in batch mode. After the jobs had completed, users could collect the output printouts and punched cards. In some organizations it could take hours or days between submitting a job to the computing center and receiving the output. A more interactive form of computer use developed commercially by the middle 1960s. In a time-sharing system, multiple teletype terminals let many people share the use of one mainframe computer processor. This was common in business applications and in science and engineering. A different model of computer use was foreshadowed by the way in which early, pre-commercial, experimental computers were used, where one user had exclusive use of a processor. Some of the first computers that might be called "personal" were early minicomputers such as the LINC and PDP-8, and later on VAX and larger minicomputers from Digital Equipment Corporation (DEC), Data General, Prime Computer, and others. They originated as peripheral processors for mainframe computers, taking on some routine tasks and freeing the processor for computation. By today's standards they were physically large (about the size of a refrigerator) and costly (typically tens of thousands of US dollars), and thus were rarely purchased by individuals. However, they were much smaller, less expensive, and generally simpler to operate than the mainframe computers of the time, and thus affordable by individual laboratories and research projects. Minicomputers largely freed these organizations from the batch processing and bureaucracy of a commercial or university computing center. In addition, minicomputers were more interactive than mainframes, and soon had their own operating systems. The Xerox Alto (1973), from the minicomputer era, was a landmark step in the development of personal computers, because of its graphical user interface, bit-mapped high-resolution screen, large internal and external memory storage, mouse, and special software.


Fifth generation

The Fifth Generation Computer Systems project (FGCS) was an initiative by Japan's Ministry of International Trade and Industry, begun in 1982, to create a "fifth generation computer" (see History of computing hardware) which was supposed to perform much calculation using massive parallel processing. It was to be the end result of a massive government/industry research project in Japan during the 1980s. It aimed to create an "epoch-making computer" with supercomputer-like performance and to provide a platform for future developments in artificial intelligence. The term "fifth generation" was intended to convey the system as being a leap beyond existing machines. Computers using vacuum tubes were called the first generation; transistors and diodes, the second; integrated circuits, the third; and those using microprocessors, the fourth. Whereas previous computer generations had focused on increasing the number of logic elements in a single CPU, the fifth generation, it was widely believed at the time, would instead turn to massive numbers of CPUs for added performance. The project was to create the computer over a ten-year period, after which it was considered ended and investment in a new Sixth Generation project began. Opinions about its outcome are divided: either it was a failure, or it was ahead of its time.

Failure

The FGCS Project did not meet with commercial success, for reasons similar to the Lisp machine companies and Thinking Machines. The highly parallel computer architecture was eventually surpassed in speed by less specialized hardware (for example, Sun workstations and Intel x86 machines). The project did produce a new generation of promising Japanese researchers. But after the FGCS Project, MITI stopped funding large-scale computer research projects, and the research momentum developed by the FGCS Project dissipated. However, MITI/ICOT embarked on a Sixth Generation Project in the 1990s. A primary problem was the choice of concurrent logic programming as the bridge between the parallel computer architecture and the use of logic as a knowledge representation and problem-solving language for AI applications. This never happened cleanly; a number of languages were developed, all with their own limitations. In particular, the committed-choice feature of concurrent constraint logic programming interfered with the logical semantics of the languages. Another problem was that existing CPU performance quickly pushed through the "obvious" barriers that experts perceived in the 1980s, and the value of parallel computing quickly dropped to the point where it was for some time used only in niche situations. Although a number of workstations of increasing capacity were designed and built over the project's lifespan, they generally found themselves soon outperformed by "off the shelf" units available commercially. The project also suffered from being on the wrong side of the technology curve. During its lifespan, GUIs became mainstream in computers; the internet enabled locally stored databases to become distributed; and even simple research projects provided better real-world results in data mining. Moreover, the project found that the promises of logic programming were largely negated by the use of committed choice. At the end of the ten-year period the project had spent over ¥50 billion (about US$400 million at 1992 exchange rates) and was terminated without having met its goals. The workstations had no appeal in a market where general-purpose systems could now take over their job and even outrun them. This is parallel to the Lisp machine market, where rule-based systems such as CLIPS could run on general-purpose computers, making expensive Lisp machines unnecessary. In spite of the possibility of considering the project a failure, many of the approaches envisioned in the Fifth Generation project, such as logic programming distributed over massive knowledge bases, are now being re-interpreted in current technologies. The Web Ontology Language (OWL) employs several layers of logic-based knowledge representation systems, while many flavors of parallel computing proliferate, including multi-core architectures at the low end and massively parallel processing at the high end.

Timeline

1982: The FGCS project begins and receives $450,000,000 worth of industry funding and an equal amount of government funding.
1985: The first FGCS hardware, known as the Personal Sequential Inference Machine (PSI), and the first version of the Sequential Inference Machine Programming Operating System (SIMPOS) are released. SIMPOS is programmed in Kernel Language 0 (KL0), a concurrent Prolog-variant with object-oriented extensions.

