Assembly Language and Computer Organization/Introduction and Overview

What Computers Do

In order to explore how computers operate, we must first arrive at an explanation of what computers actually do. No doubt, you have some idea of what computers are capable of. After all, odds are high that you are reading this on a computer right now! If we tried to list all the functions of computers, we would come up with a staggeringly complex array of functions and features. Instead, what we need to do is peel back all the layers of functionality and see what's going on in the background.

A Basic Computer

If we look at a computer at its most abstract level, what it does is process information. We could represent this sort of abstract machine as some sort of memory and some sort of processor. The processor can read and write items in the memory according to some list of instructions, while the memory simply remembers the data it has been given. The list of instructions executed by the processor is known as a computer program. The details of how this arrangement is accomplished determine the nature of the computer itself.

Using this definition, quite a wide range of devices can be classified as computers. Most of the devices we call "computers" are, in fact, universal computers. That is, if they were given an infinite amount of memory, they would be able to compute any computable function. The requirements for a universal computer are actually very simple. Basically, a machine is universal if it is able to do the following:

1. Read a value from memory.
2. Based on the read value, determine a new value to write to memory.
3. Based on the read value, determine which instruction to execute next.

A machine following these rules is said to be "Turing Complete." A complete definition of Turing Completeness is beyond the scope of this book. If you are interested, please see the articles on Turing Machines and Turing Completeness.

Trying to create programs in terms of the constraints of universal computers would be very difficult.
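The three rules above can be made concrete with a small sketch. This is not from the book; it is one hypothetical way to encode "read, write based on what was read, branch based on what was read" as a list of instructions, with each instruction being a memory address plus two functions:

```python
def run(memory, program):
    """Execute a toy universal machine.

    Each instruction is a tuple (addr, update, branch):
      - addr:   which memory cell to read             (rule 1)
      - update: new value for that cell, computed
                from the value that was read          (rule 2)
      - branch: index of the next instruction,
                chosen based on the value read        (rule 3)
    The machine halts when the instruction index
    leaves the program.
    """
    pc = 0  # index of the current instruction
    while 0 <= pc < len(program):
        addr, update, branch = program[pc]
        value = memory[addr]           # rule 1: read a value
        memory[addr] = update(value)   # rule 2: write a value based on it
        pc = branch(value, pc)         # rule 3: pick the next instruction
    return memory

# Example program: decrement memory[0] until it reaches zero,
# counting the iterations in memory[1].
prog = [
    # instruction 0: decrement mem[0] if nonzero; if it was
    # nonzero go to instruction 1, otherwise jump past the end (halt)
    (0, lambda v: v - 1 if v else v,
        lambda v, pc: 1 if v else 2),
    # instruction 1: increment the counter, then loop back
    (1, lambda v: v + 1,
        lambda v, pc: 0),
]

run([3, 0], prog)   # returns [0, 3]: counted down 3 steps
```

Even this tiny machine shows why programming at the level of the universal minimum is painful: a simple counting loop already requires careful hand-wiring of branches.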
It would be more convenient to create a list of more useful instructions such as "add", "subtract", "multiply", and "divide." The list of instructions that the processor can execute is called the machine's instruction set. This instruction set helps to define the behavior of the machine, and further defines the nature of the programs which will run on the machine.

Another defining characteristic of the machine answers the question "Where does the program get stored?" You'll note that in our diagram of the basic computer, we did not show where the program resides. This was done on purpose, as answering this question helps to define the different types of computers. Breaking down the components of the system further is the topic of the next section.

Computer Architecture

Another defining characteristic of computers is the logical layout of the system. This property is called the computer's architecture. Computer architecture describes how a machine is logically organized and how its instruction set is actually implemented. One of the most important architectural decisions made in designing a computer is how its memory is organized, and how programs are loaded into the machine.

In the early history of computers, there was a distinction between stored-program and hardwired computers. For example, the first general-purpose electronic computer, the ENIAC, was programmed by wiring units of the machine to devices called function tables. Later machines, like the EDSAC, stored programs as data in program memory. Of course, stored programs proved to be much more convenient: rewiring a machine simply takes too long compared to the process of simply loading new data into the machine. In the current era of computing, almost all computers are stored-program computers, and in this book we will be focusing on stored-program computer architectures.
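As a toy illustration of a stored-program machine with a convenient instruction set: the program below is just data (a list handed to the processor), and the mnemonics and three-address format are hypothetical, not taken from any real machine:

```python
def execute(memory, program):
    """A sketch of a processor with a small, convenient instruction set.

    Each instruction is (op, dst, src1, src2), meaning
    memory[dst] = memory[src1] <op> memory[src2].
    """
    ops = {
        "add":      lambda a, b: a + b,
        "subtract": lambda a, b: a - b,
        "multiply": lambda a, b: a * b,
        "divide":   lambda a, b: a // b,   # integer division for simplicity
    }
    for op, dst, src1, src2 in program:
        memory[dst] = ops[op](memory[src1], memory[src2])
    return memory

# The program is ordinary data: loading a new one needs no rewiring.
mem = [6, 7, 0, 0]
prog = [
    ("multiply", 2, 0, 1),   # mem[2] = mem[0] * mem[1]
    ("add",      3, 2, 0),   # mem[3] = mem[2] + mem[0]
]
execute(mem, prog)           # mem becomes [6, 7, 42, 48]
```

Compare this with the universal-minimum machine earlier: the same arithmetic that took a hand-wired loop of conditional branches is now a single named instruction.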
Having settled on storing programs as data, we now have to turn our attention to where, and how, programs are stored in memory. In the past, many different solutions to this problem have been proposed, but today there are two main high-level variants. These are the Von Neumann architecture and the Harvard architecture.

Von Neumann Architecture

The Von Neumann architecture is, by far, the most common architecture in existence today. PCs, Macs, and even Android phones are examples of Von Neumann computers. The Von Neumann architecture is a stored-program computer which consists of the following components:

1. A CPU (Central Processing Unit), which executes program instructions.
2. A memory, which stores both programs and data.
3. Input and output devices.

The defining characteristic of the Von Neumann architecture is that both data and instructions are stored in the same memory device. Programs are stored as a collection of binary patterns which the CPU decodes in order to execute instructions. Of course, the data are stored as bit patterns in memory as well. The CPU can only distinguish between data and program code by the context in which it encounters them in memory. The CPU in this architecture begins executing instructions at some memory location, and continues to step through instructions. Anything it encounters while stepping through the program is treated as program instructions. When the CPU is issued instructions to read or write data in the memory, it treats the numbers in memory as data. Of course, this also means that a Von Neumann machine is perfectly capable of modifying its own program.

Computer Arithmetic

Representation of Numbers and Bases
Binary Numbers
Hexadecimal Numbers
Bits, Bytes, and Nibbles
Arithmetic in Arbitrary Bases
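The Von Neumann idea described above, that one memory holds both instructions and data and the CPU distinguishes them only by context, can be sketched as follows. The numeric opcodes and single-accumulator design here are hypothetical, chosen only for illustration:

```python
# Hypothetical opcodes: everything in memory is just a number.
HALT, LOAD, ADD, STORE = 0, 1, 2, 3

def cpu(memory):
    """Fetch-decode-execute over a single shared memory.

    Instructions are pairs of cells (opcode, operand). Whether a
    number is "code" or "data" depends only on how the CPU reaches
    it: via the program counter, or via an operand address.
    """
    pc = 0    # program counter: where the CPU starts stepping
    acc = 0   # a single accumulator register
    while True:
        op, operand = memory[pc], memory[pc + 1]  # fetched as plain numbers
        pc += 2
        if op == LOAD:
            acc = memory[operand]        # operand treated as a data address
        elif op == ADD:
            acc += memory[operand]
        elif op == STORE:
            memory[operand] = acc
        elif op == HALT:
            return memory

# Cells 0-6 happen to hold the "program", cells 8-10 the "data",
# but it is all one memory; nothing marks the boundary.
ram = [LOAD, 8, ADD, 9, STORE, 10, HALT, 0,   # program
       20, 22, 0]                             # data
cpu(ram)   # ram[10] is now 42
```

Note that a STORE aimed at cells 0-6 would overwrite instructions, which is exactly the self-modification the text mentions; the hardware does nothing to prevent it.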
