Moore's law is the observation that over the history of computing hardware, the number of transistors on integrated circuits doubles approximately every two years. The period often quoted as "18 months" is due to Intel executive David House, who predicted that period for a doubling in chip performance (a combination of the effect of having more transistors and of those transistors being faster). The law is named after Intel co-founder Gordon E. Moore, who described the trend in his 1965 paper. The paper noted that the number of components in integrated circuits had doubled every year from the invention of the integrated circuit in 1958 until 1965 and predicted that the trend would continue "for at least ten years". His prediction has proven to be uncannily accurate, in part because the law is now used in the semiconductor industry to guide long-term planning and to set targets for research and development.
source: http://en.wikipedia.org/wiki/Moore's_law
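The doubling rule above is easy to check numerically: if the transistor count doubles every two years, then after t years it has grown by a factor of 2^(t/2). A quick sketch (the starting figure of 2,300 transistors, roughly the Intel 4004 of 1971, is used purely for illustration):

```python
# Illustrating Moore's law: a doubling every two years means
# a growth factor of 2 ** (t / 2) after t years.

start_count = 2_300          # illustrative figure (roughly the Intel 4004, 1971)
years = 10

growth = 2 ** (years / 2)    # five doublings in ten years
final_count = start_count * growth

print(growth)                # 32.0
print(final_count)           # 73600.0
```

So even over a single decade the rule predicts a 32-fold increase, which is why small differences in the assumed doubling period (18 vs. 24 months) matter so much for long-term planning.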
2. In a computer system, what are the main functions of the central processing unit and random access memory?
CPU- The central processing unit is the hardware within a computer system or smartphone that carries out the instructions of a computer program by performing the basic arithmetical, logical, and input/output operations of the system. The fundamental operation of most CPUs, regardless of the physical form they take, is to execute a sequence of stored instructions called a program. The program is represented by a series of numbers that are kept in some kind of computer memory. There are four steps that nearly all CPUs use in their operation: fetch, decode, execute, and writeback.
The CPU understands machine-code instructions, which programmers commonly write in an assembly programming language. The CPU can also move data from one memory location to another.
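The fetch-decode-execute-writeback cycle described above can be sketched with a toy interpreter. The four-instruction set here (LOAD, ADD, STORE, HALT) is hypothetical and exists only to make the loop visible; it is not any real CPU's encoding:

```python
# A minimal sketch of the fetch/decode/execute/writeback cycle,
# using a made-up accumulator-style instruction set.

# Program "memory": each instruction is (opcode, operand).
# LOAD n  -> put n into the accumulator
# ADD n   -> add n to the accumulator
# STORE a -> write the accumulator to data memory address a
# HALT    -> stop
program = [("LOAD", 2), ("ADD", 3), ("STORE", 0), ("HALT", None)]

data_memory = [0] * 4
accumulator = 0
pc = 0  # program counter

while True:
    opcode, operand = program[pc]           # fetch the next instruction
    pc += 1
    if opcode == "LOAD":                    # decode and execute
        accumulator = operand
    elif opcode == "ADD":
        accumulator += operand
    elif opcode == "STORE":
        data_memory[operand] = accumulator  # writeback to memory
    elif opcode == "HALT":
        break

print(data_memory[0])  # 5
```

A real CPU does the same thing in hardware: the program counter selects the instruction, decode logic routes it to the right functional unit, and the result is written back to a register or to memory.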
Sources: http://en.wikipedia.org/wiki/Central_processing_unit http://silwen.hubpages.com/hub/What-are-the-basic-functions-of-a-CPU
RAM- RAM is electronic, or volatile, storage. When the computer is off, RAM is empty; when it is on, RAM is capable of receiving and holding a copy of the software instructions and data necessary for processing. RAM is used for the following purposes:
- Storage of a copy of the main systems program that controls the general operation of the computer. This copy is loaded into RAM when the computer is turned on; it stays there as long as the computer is on.
- Temporary storage of a copy of application program instructions, to be retrieved by the central processing unit (CPU) for interpretation and execution.
- Temporary storage of data that has been input from the keyboard or other input device, until instructions call for the data to be transferred into the CPU for processing.
- Temporary storage of data that has been produced as a result of processing, until instructions call for the data to be used again in subsequent processing or to be transferred to an output device such as the screen, a printer, or a disk storage device.
Source: http://www.blurtit.com/q248576.html
Data refers to the lowest level of abstraction, or raw input, which when processed or arranged makes meaningful output. It is the group or chunks of symbols which represent quantitative and qualitative attributes pertaining to variables. Information is usually the processed outcome of data; more specifically, it is derived from data. Information is a concept and can be used in many domains.
Information is data that has been processed in such a way as to be meaningful to the person who receives it; it is anything that is communicated. Information can be a mental stimulus, perception, representation, knowledge, or even an instruction. Examples of data are facts, analyses, or statistics. In computer terms, symbols, characters, images, or numbers are data. These are the inputs the system processes to give a meaningful interpretation. In other words, data in a meaningful form is information.
For example, researchers who conduct a market research survey might ask members of the public to complete questionnaires about a product or a service. These completed questionnaires are data; they are processed and analyzed in order to prepare a report on the survey. The resulting report is information.
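The survey example above can be sketched in a few lines of code. The responses below are invented for illustration: the raw list of answers is the data, and the percentage summary produced from it is the information:

```python
# Data vs. information: raw questionnaire answers (data) are
# processed into a summary report (information).

# Raw data: one answer per completed questionnaire (hypothetical).
responses = ["satisfied", "unsatisfied", "satisfied", "satisfied"]

# Processing: express each answer as a percentage of all responses.
total = len(responses)
report = {answer: responses.count(answer) / total * 100
          for answer in set(responses)}

print(f"{report['satisfied']:.0f}% of respondents were satisfied")  # 75%
```

The list of individual answers tells the reader little on its own; the derived percentages are meaningful to the person receiving them, which is exactly the distinction the text draws.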
6. What are the five (5) main systems that make up the evolution of computer applications? Briefly discuss each.
Mainframe Computers
In the early 1960s, mainframe computers began to find their way into large enterprises. These mainframes consisted of extremely large computers that were responsible for all logic, storage, and processing of data. "Dumb terminals" allowed various users to interact with the mainframe. These systems continued in widespread use for more than 30 years, and to some degree, continue to exist today. Architecturally, they were designed at a time when processing power was scarce and expensive; therefore, it was cost effective to centralize all the power on the server. The clients for mainframe systems contained virtually no logic because they relied on the server for everything, including the display logic.
Client/Server Computing
With the migration from mainframe to microcomputer, the pendulum swung from one extreme (having all logic on the server) to the other extreme (having all logic on the client). Sensing the imbalance in this, several vendors began to develop a system that could encapsulate all the benefits of the microcomputer as well as those of the mainframe systems. This led to the birth of client/server applications. Client/server applications were frequently written in languages such as Visual Basic or PowerBuilder, and they offered a lot of flexibility to application developers. Interfaces that were very interactive and intuitive could be created and maintained independent of the logic that drove the application functionality. This separation allowed modifications to be made to the user interface (the place in an application where changes are most frequent) without the need to impact business rules or data access. Additionally, by connecting the client to a remote server, it became possible to build systems in which multiple users could share data and application functionality. With business and data access logic centrally located, any changes to these could be made in a single place. Although traditional client/server applications offered tremendous advantages over stand-alone and mainframe applications, they all lacked a way to distribute the client. This meant that for each change that needed to be made to the user interface, the files comprising the client had to be reinstalled at each workstation, often requiring dynamic link library (DLL) files to be updated. The phrase "DLL hell" aptly captured the frustration of many IT professionals whose job it was to keep the client applications current within a business.
The Internet
During the days of the client/server dominance, the U.S. government project ARPANet was renamed "Internet" and started becoming available to businesses as a means to share files across a distributed network. Most of the early protocols of the Internet, such as File Transfer Protocol (FTP) and Gopher, were specifically related to file
sharing. The Hypertext Transfer Protocol (HTTP) followed these and introduced the concept of "hyperlinking" between networked documents. The Internet, in many ways, is like the mainframe systems that predate it, in that an ultra-thin client (the browser) is used to display information retrieved by the server. The documents on the server contain all the information needed to determine how the page will be displayed in the client. Businesses began to embrace the Internet as a means to share documents, and in time, many realized that the distributed nature of the Internet could free them from the DLL hell of their client/server applications. This newfound freedom led to the use of the Internet as more than a document-sharing system and introduced the concept of the web-based application. Of course, these web-based applications lacked the richness and usability that had been taken for granted in the client/server days.
Source: http://oopas2.uw.hu/ch10lev1sec1.html