
I.

INTRODUCTION
(April Dannee F. Ninalga)

 Computers and Information Systems


A computer can be defined as an electronic device that has the ability to accept data,
store and execute a program of instructions, perform mathematical and logical
operations on data, and report the results. A computer system has the following
common features regardless of brand, type, or size.
 input and output devices
 primary and secondary storage
 processor and control unit
 peripheral devices
Modern digital computers are all conceptually similar, regardless of size. Nevertheless,
they can be divided into several categories on the basis of cost and performance.
• Mainframe computer - This is a large, expensive machine with the capability of
serving the needs of major business enterprises, government departments,
scientific research establishments, or the like.
• Midrange computer or minicomputer - This is a middle-sized computer that is
capable of supporting the computing needs of smaller organizations or of
managing networks of other computers. It is generally too expensive for personal
use, and has capabilities suited to a business, school, or laboratory.
• Microcomputer - This is a small computer used in systems for universities,
factories, or research laboratories. Under this category are the following:
o personal computer - a relatively low-cost machine, usually of desktop size
(though laptops are small enough to fit in a briefcase, and palmtops can fit
into a pocket, or even wearable PCs);
o server - computer that is specifically optimized to provide software and
other resources to other computers over a network; and
o server farm - a large group of servers maintained by a commercial vendor
and made available via subscription for electronic commerce and other
activities requiring heavy use of servers.
• Workstation - This is a desktop computer with enhanced graphics,
mathematical, and communications capabilities that make it especially useful for
performing complicated tasks at once. They are ideal for office work.
• Supercomputer - This is a highly sophisticated and powerful computer that can
perform very complex operations at extreme speed.
• Thin client - This computer functions only when connected to a server.
An information system is not a concept that is confined purely to computers.

An information system (IS) is a set of people, procedures, and resources that collects,
transforms, and disseminates information in an organization. It is a system that
accepts data resources as input and processes them into information products as
output. An
information system can be an organized combination of:
• hardware (physical equipment, machines, media; may be mechanical, electronic,
electrical, magnetic, or optical devices)
• software (computer programs and procedures concerned with the operation of
the information system)
• data/information
o Data - streams of raw facts
o Information - processed data
• people (information specialists, librarians, knowledge workers, IT people, etc.)
• communication networks (LAN, client/server networks, internet, intranet, etc.)
A computer-based information system (CBIS) relies on computer hardware and
software for processing and disseminating information. The librarian or information
specialist provides and delivers information systems services, which nowadays is
usually computer-based.
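To make the data/information distinction above concrete, here is a minimal sketch in Python (the loan records and the summary rule are hypothetical, chosen only for illustration):

    # Data: streams of raw facts -- individual loan transactions (hypothetical).
    from collections import Counter

    loans = [
        {"title": "Dune", "month": "Jan"},
        {"title": "Dune", "month": "Jan"},
        {"title": "Emma", "month": "Jan"},
    ]

    # Information: processed data -- loans per title, ready to support a decision.
    loans_per_title = Counter(loan["title"] for loan in loans)
    print(loans_per_title.most_common(1))  # [('Dune', 2)] -- most-borrowed title

The raw records answer no question by themselves; counting them turns the data into information a librarian can act on.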

Reference:
Buenrostro, J. (2015). Library and Information Science Reviewer. Retrieved December
10, 2017 from https://www.scribd.com/document/317123232/Library-and-
Information-Science-Reviewer

II. HISTORY OF COMPUTING
(April Dannee F. Niñalga)

 Pre-Computer Age and Calculating Machines


The abacus is one of the earliest machines invented over 2000 years ago by Asian
merchants to speed up calculation. It is a simple hand device for recording numbers or
performing simple calculations.
Calculating machines were first introduced in the 17th century. In 1642, the first
calculating machine that can perform addition and subtraction, a precursor of the digital
computer, was devised by the French scientist, mathematician, and philosopher Blaise
Pascal. This device employed a series of ten-toothed wheels, each tooth representing a
digit from 0 to 9. The wheels were connected so that numbers could be added to each
other by advancing the wheels by a correct number of teeth. In the 1670s the German
philosopher and mathematician Gottfried Wilhelm Leibniz improved on this machine by
devising one that could also multiply.
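As a rough illustration of how such ten-toothed wheels carry digits, here is a short Python sketch (an idealized model of the carrying idea, not a description of Pascal's actual mechanism):

    # Idealized model: each wheel holds one digit (0-9), least significant first;
    # advancing a wheel past 9 turns the next wheel by one tooth (the carry).
    def wheel_add(wheels, addend):
        carry = addend
        for i in range(len(wheels)):
            total = wheels[i] + carry
            wheels[i] = total % 10      # the tooth the wheel now shows
            carry = total // 10         # passed on to the next wheel
            if carry == 0:
                break
        return wheels

    digits = [7, 9, 0]           # represents 097
    print(wheel_add(digits, 5))  # [2, 0, 1] -> 102, two carries rippled through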
It was in 1820 when the next generation of calculating devices, the
arithmometer, was invented by Charles Xavier Thomas of France. It combined the
features of the Leibniz calculator with newer engineering techniques. The first
mechanical calculator produced in the US was developed in 1872 by Frank S. Baldwin.
Improving on the Leibniz design, it was a much smaller and lighter calculator. The first commercial calculator
that was both a calculating and a listing machine was developed in 1886 by William
Seward Burroughs, an American bank clerk.
 Punched Card Information Processing and the Analytical Engine
The French weaver and inventor Joseph-Marie Jacquard, designed an automatic loom
(Jacquard's loom), which used thin, perforated wooden boards to control the weaving of
complicated cloth designs. The concept of recording data in the form of holes punched
in cards was used in the design of punched card information processing equipment.
Another lesson learned from Jacquard was that work can be performed
automatically if a set of instructions can be given to a machine to direct it in its
operations. This was fundamental to the development of computers.
During the 1880s the American statistician Herman Hollerith, who worked at the US
Bureau of the Census, conceived the idea of using perforated cards (punched cards similar to
Jacquard's boards) for processing data. Employing a system that passed punched
cards over electrical contacts, he devised the Hollerith punched-card tabulating
machine, which he used to speed up the compilation of statistical information for the
1890 United States census. Hollerith went on to establish the Tabulating Machine
Company to manufacture and market his invention, which in 1911 merged with other
organizations to form the Computing-Tabulating-Recording Company.

In 1924, after further acquisitions, the Computing-Tabulating-Recording Company was
renamed International Business Machines Corporation (IBM). Thomas J. Watson, Sr.,
who had joined the company in 1914, built the foundering
company into an industrial giant. IBM soon became the country's largest manufacturer
of time clocks and developed and marketed the first electric typewriter. In 1951 the
company entered the computer field. The punched card technology was widely used
until the mid-1950s.
Also in the 19th century, the British mathematician and inventor Charles Babbage
(referred to as the Father of the modern computer) worked out the principles of the
modern digital computer. He conceived a number of machines, such as the Difference
Engine and the Analytical Engine, forerunners of the modern computer, that were
designed to handle complicated mathematical problems. One of Babbage's designs, the
Analytical Engine, had many features of a modern computer. It had an input stream in
the form of a deck of punched cards, a "store" for saving data, a "mill" for arithmetic
operations, and a printer that made a permanent record. Babbage failed to put this idea
into practice, though it may well have been technically possible at that date.
Many historians consider Babbage and his associate, the mathematician Augusta Ada
Byron, Countess of Lovelace and daughter of the poet, Lord Byron, the true pioneers of
the modern digital computer. The latter provided complete details as to exactly how the
analytical engine was to work. Because she described some of the key elements in
computer programming, she was referred to as the "world's first computer programmer".
 Early Computers
Analogue computers began to be built in the late 19th century. Early models calculated
by means of rotating shafts and gears. Numerical approximations of equations too
difficult to solve in any other way were evaluated with such machines. Lord Kelvin built a
mechanical tide predictor that was a specialized analogue computer. During World
Wars I and II, mechanical and, later, electrical analogue computing systems were used
as torpedo course predictors in submarines and as bombsight controllers in aircraft.
Another system was designed to predict spring floods in the Mississippi River basin.
In the United States, a prototype electronic machine had been built as early as 1939, by
John Atanasoff and Clifford Berry, at Iowa State College. This prototype and later
research led to the development of the Atanasoff-Berry Computer (ABC), which is
considered the first electronic computing machine. It could only
perform addition and subtraction, and it never became operational because of the
involvement of the inventors in US military efforts during World War II.
In 1944, Howard Aiken completed the MARK I computer (also known as the Automatic
Sequence Controlled Calculator), the first electromechanical computer. It could solve
mathematical problems 1,000 times faster than existing machines.

The first electronic computer to be made operational was the Electronic Numerical
Integrator and Calculator (ENIAC). It was built in 1946 for the US Army to perform
quickly and accurately the complex calculations that gunners needed to aim their
artillery weapons. ENIAC contained 18,000 vacuum tubes and had a speed of several
hundred multiplications per minute, but originally its program was wired into the
processor and had to be manually altered.
Scientists at Cambridge University in England designed the world's first
electronic computer that stored its program of instructions, the Electronic Delay Storage
Automatic Calculator (EDSAC). This gave more flexibility in the use of the computer.
Two years later (1951), machines were built with program storage based on the ideas
of the Hungarian-American mathematician John von Neumann, a consultant to the
University of Pennsylvania's EDVAC project. The instructions, like the data, were stored within a "memory", freeing the
computer from the speed limitations of the paper-tape reader during execution and
permitting problems to be solved without rewiring the computer. This concept gave birth
to the Electronic Discrete Variable Automatic Computer (EDVAC).
During World War II a team of scientists and mathematicians, working at Bletchley Park,
north of London, created one of the first all-electronic digital computers: Colossus. By
December 1943, Colossus, which incorporated 1,500 vacuum tubes, was operational. It
was used by the Bletchley Park codebreakers, among them Alan Turing, in the largely
successful effort to crack enciphered German radio messages.

Reference:
Buenrostro, J. (2015). Library and Information Science Reviewer. Retrieved December
10, 2017 from https://www.scribd.com/document/317123232/Library-and-
Information-Science-Reviewer

III. GENERATION OF COMPUTERS
(Darwin Gonzalodo)

 First Generation of Computers


The first generation of computers (1951-1959) is characterized by the use of the vacuum
tube; these machines were very large in size (a mainframe could occupy a whole room).
The first business computer, the Universal Automatic Computer (UNIVAC I), was
developed in 1951. It was invented to improve information processing in business
organizations.
In 1953, IBM produced the first of its computers, the IBM 701, a machine designed to be
mass-produced and easily installed in a customer's building. The success of the 701 led
IBM to manufacture many other machines for commercial data processing. The IBM
650 computer is probably the reason why IBM enjoys such a healthy share of today's
computer market. The sales of IBM 650 were a particularly good indicator of how rapidly
the business world accepted electronic data processing. Initial sales forecasts were
extremely low because the machine was thought to be too expensive, but over 1,800
were eventually made and sold.
The invention of the integrated circuit (IC) by Jack S. Kilby of Texas Instruments in
1958 is considered a great invention that changed how the world functions. It is the
heart of all electronic equipment today.
Between 1959 and 1961, COBOL (Common Business-Oriented Language) was developed,
based largely on the work of Grace Murray Hopper. It is a
verbose, English-like programming language. Its establishment as a required language
by the United States Department of Defense, its emphasis on data structures, and its
English-like syntax led to its widespread acceptance and usage, especially in business
applications. It is a champion of standardized programming languages that are
hardware independent: COBOL runs on many types of computers through compilers,
and Hopper herself had designed the first compiler.
 Second Generation of Computers
The invention of the transistor marked the start of the second generation of computers (ca.
1954-1964), which were smaller in size (a mainframe could be the size of a closet).
Second-generation computers used smaller, faster, and more versatile logical elements
than were possible with vacuum-tube machines. Because transistors use much less
power and have a much longer life, components became smaller, as did inter-
component spacings, and the systems became much less expensive to build. The
Honeywell 400 computer was the first in the line of second-generation computers.
In the 1950s and 1960s, only the largest companies could afford the six- to seven-digit
price tags of mainframe computers. Digital Equipment Corporation introduced the PDP-8,
which is generally considered the first successful transistor-based minicomputer. It
was an instant hit, and there was tremendous demand from business and scientific
organizations.
 Third Generation of Computers
Although the first IC was invented earlier, during the era of first-generation computers, it
was only in the late 1960s that it was introduced commercially, making it possible for many transistors to
be fabricated on one silicon substrate, with interconnecting wires plated in place. The IC
resulted in a further reduction in price, size, and failure rate. This was the start of the third
generation of computers (mid-1960s to mid-1970s).
Some historians consider the IBM System/360 family of computers the single most important
innovation in the history of computers. It was conceived as a family of computers with
upward compatibility: when a company outgrew one model, it could move up to the next
model without worrying about converting its data. This made all previous computers
obsolete.
In 1964, Beginner's All-purpose Symbolic Instruction Code (BASIC), a high-level
programming language, was developed by John Kemeny and Thomas Kurtz
at Dartmouth College. BASIC gained its enormous popularity mostly because it can be
learned and used quickly. The language has changed over the years, from a teaching
language into a versatile and powerful language for both business and scientific
applications.
In 1969, two Bell Telephone Labs software engineers, Dennis Ritchie and Ken
Thompson, who had worked on the multi-user Multics (Multiplexed
Information and Computing Service) system, implemented a rudimentary
operating system they named Unics, as a pun on Multics. Somehow, the name became
UNIX. The most notable feature of this operating system is its portability: the operating
system is machine-independent, runs on many types of computers, and supports multi-
user processing, multitasking, and networking. UNIX is used in high-end workstations
and servers. It is written in the C language, which was developed by Ritchie.
 Fourth Generation of Computers
The introduction of large-scale integration of circuitry (more circuits per unit of space)
marks the beginning of the fourth generation of computers. The base technology is still
the IC, though it saw significant innovation in the decades that followed. The
computer industry experienced a mind-boggling succession of advancements in
the further miniaturization of circuitry, data communications, and the design of computer
hardware and software. The microprocessor became a reality in the mid-1970s with the
introduction of the large-scale integrated (LSI) circuit.
Bill Gates and Paul Allen revolutionized the computer industry. They developed the
BASIC programming language for the first commercially-available microcomputer, the
MITS Altair. After successful completion of the project, the two formed Microsoft
Corporation in 1975. Microsoft is now the largest and most influential software company
in the world. Microsoft was given an enormous boost when its operating system
software, MS-DOS, was selected for use by the IBM PC. Gates, now the wealthiest
person in the world, provides the company's vision of new product ideas and
technologies.
One important entrepreneurial venture during these early years was the Apple II personal
computer, introduced in 1977. This event forever changed how society
perceives computers: computing was now available to individuals and very small
companies.
IBM tossed its hat into the personal computer ring with its release of the IBM personal
computer in 1981. By the end of 1982, 835,000 units had been sold. When software
vendors began to orient their products to the IBM PC, many companies began offering
IBM PC-compatibles or clones. Today, the IBM PC and its clones have become a
powerful standard in the microcomputer industry.
In 1982, Mitchell Kapor founded the Lotus Development Corporation. It introduced an
electronic spreadsheet product (Lotus 1-2-3) that gave the IBM PC credibility
in the business marketplace. Sales of the IBM PC and Lotus 1-2-3 soared.
In 1984, Apple introduced the Macintosh desktop computer with a very
friendly graphical user interface (GUI). This was proof that computers can be easy
and fun to use. The GUI began to change the complexion of the software industry. It
changed the interaction between the user and the computer from a short,
character-oriented exchange modeled on the teletypewriter to the now-famous WIMP
interface (WIMP stands for windows, icons, menus, and pointing devices).
It was in 1985 when Microsoft adopted the GUI in its Windows operating system for IBM
PC-compatible computers. Windows did not enjoy widespread acceptance until 1990,
with the release of Windows 3.0. It gave a huge boost to the software industry because
larger, more complex programs could now be run on IBM PC-compatibles. Subsequent
releases made the PC even easier to use, fueling the PC explosion in the 1990s.
In 1991, Linus Torvalds developed Linux, a reliable and compactly designed operating
system that is an offshoot of UNIX and can run on many different hardware
platforms. It is available free or at very low cost, and has been used as an alternative to
the costly Windows operating system. IBM PC-compatible PCs had used Intel
microprocessor chips from the start, followed by a succession of ever more powerful chips. But
not until the Intel Pentium, introduced in 1993, and its successors did PCs do much with multimedia (the
integration of motion, video, animation, graphics, sound, and so on). The emergence of
the high-powered Intel Pentium processors and their ability to handle multimedia
applications changed the way people view and use PCs.
It was also in 1993 when millions of people began to tune into the Internet for news.
The World Wide Web (WWW), one of several Internet-based applications, came of age
as Web traffic grew 341,634%. The Web is unique in that it enables Web pages to be
linked across the Internet. A number of Internet browsers were introduced (e.g. Mosaic
and Netscape Navigator, which were developed by Marc Andreessen, and Internet
Explorer by Microsoft Corporation). These browsers enabled users to navigate the
World Wide Web with ease. Today, the WWW is the foundation for most Internet
communications and services. The World Wide Web was actually created in 1991 by
Tim Berners-Lee, an engineer at CERN in Geneva, Switzerland.
 Fifth Generation of Computers
The fifth generation of computers is characterized by the very large-scale integrated
(VLSI) circuit (microchip), with many thousands of interconnected transistors etched into
a single silicon substrate. It is also characterized by networked computers of all sizes, the
Internet, intranets, and extranets.
The year 1996 marked the 50th year of computer history. The US Postal service issued
stamps that commemorated the 50th anniversary of ENIAC, the first full-scale computer
and the 50 years of computer technology that followed. It was during this year when the
handheld computer was introduced, signaling to the world that tremendous computing
power could be placed in the palm of your hand. Nowadays, millions of people
rely on handhelds for a variety of personal information management applications,
including e-mail.
In 1999, the world was threatened by the Y2K problem, known as
the millennium bug. It may have been one of the biggest challenges ever to confront the
businesses of the world. For most of the 20th century, information systems had used only two
digits to represent the year (e.g. 99 for 1999). When the 20th
century ended, non-compliant computers would interpret
the date 01-01-00 as January 1, 1900 rather than January 1, 2000. Y2K heightened
management's awareness of how critical information technology is to the operation of
any organization.
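A hedged sketch of the underlying pitfall: with only two digits stored, the century must be guessed, and non-compliant logic guesses wrong (the pivot value shown is illustrative, not a standard):

    # Two-digit years force a guess about the century.
    def naive_year(two_digit):
        # Non-compliant logic: assume every date falls in the 1900s.
        return 1900 + two_digit

    def windowed_year(two_digit, pivot=50):
        # A common remediation: a "pivot window" (the pivot of 50 is illustrative).
        return 2000 + two_digit if two_digit < pivot else 1900 + two_digit

    print(naive_year(0))      # 1900 -- January 1, 2000 misread as 1900
    print(windowed_year(0))   # 2000
    print(windowed_year(99))  # 1999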
Jack Kilby's first IC contained a single transistor. Tens of thousands of engineers around
the world have built on his invention, such that each year our society is the beneficiary
of smaller, more powerful, cheaper chips. One continuing trend in computer
development is microminiaturization, the effort to compress more circuit elements into
smaller and smaller chip space. In 1999, scientists developed a circuit the size of a
single layer of molecules, and in 2000 IBM announced that it had developed new
technology to produce computer chips that operate five times faster than the most
advanced models to date. Also in 2000, scientists discovered a way to transfer
information on an atomic level without relying on traditional wires or circuits. This effect,
dubbed the quantum mirage, describes how an atom of matter placed in an elliptical-
shaped structure on a solid surface reflects itself at other points within the ellipse,
thereby relaying information. Researchers are also trying to speed up circuitry functions
through the use of superconductivity, the phenomenon of decreased electrical
resistance observed in certain materials at very low temperatures.
Whether we are moving into a fifth generation of computing is a subject of debate since
the concept of generations may no longer fit the continual, rapid changes occurring in
computer hardware, software, data, and networking technologies. But in any case, we
can be sure that progress in computing will continue to accelerate and that the
development of Internet-based technologies and applications will be one of the major
forces driving computing in the 21st century.

Reference:
Buenrostro, J. (2015). Library and Information Science Reviewer. Retrieved December
10, 2017 from https://www.scribd.com/document/317123232/Library-and-
Information-Science-Reviewer

IV. COMPUTER HARDWARE
(Maria Dominique R. Miranda)
 Hardware
Hardware refers to the physical elements of a computer. It is also sometimes called
the machinery or the equipment of the computer. Examples of hardware in a computer
are the keyboard, the monitor, the mouse and the central processing unit. However,
most of a computer's hardware cannot be seen; in other words, it is not an external
element of the computer, but rather an internal one, surrounded by the computer's
casing (tower). A computer's hardware is composed of many different parts, but
perhaps the most important of these is the motherboard. The motherboard is made up
of even more parts that power and control the computer.
In contrast to software, hardware is a physical entity. Hardware and software are
interconnected: without software, the hardware of a computer would have no function.
However, without hardware to perform the tasks directed by software via the
central processing unit, software would be useless.
Hardware is limited to specifically designed tasks that are, taken independently, very
simple. Software implements algorithms (problem solutions) that allow the computer to
complete much more complex tasks.
 The Basic Structure of A Computer System Consists of Three Parts

1. CPU
Alternately referred to as a processor, central processor, or microprocessor, the CPU
(pronounced sea-pea-you) is the Central Processing Unit of the computer. A computer's
CPU handles all instructions it receives from hardware and software running on the
computer.
2. INPUT – OUTPUT DEVICES

 Keyboard- A computer keyboard is one of the primary input devices used with a
computer that looks similar to those found on electric typewriters, but with some
additional keys. Keyboards allow you to input letters, numbers, and other symbols
into a computer that can serve as commands or be used to type text.

 Monitor- In computers, a monitor is a computer display and related parts packaged


in a physical unit that is separate from other parts of the computer. Notebook
computers don't have monitors because all the display and related parts are
integrated into the same physical unit with the rest of the computer. In practice, the
terms monitor and display are used interchangeably.
 Modem- Modem is short for "Modulator / Demodulator." It is a hardware component
that allows a computer or other device, such as a router or switch, to connect to the
Internet. It converts or "modulates" an analog signal from a telephone or cable wire
to a digital signal that a computer can recognize. Similarly, it converts outgoing
digital data from a computer or other device to an analog signal. (A toy sketch of
this modulation idea appears after this list of devices.)

 Mouse- This was invented by Douglas Engelbart and was popularized by its
inclusion as standard equipment with the Apple Macintosh. It helps a user navigate
through a graphical computer interface. It is generally mapped so that an on-screen
cursor may be controlled by moving the mouse across a flat surface. There are
many variations on mouse design, but they all work in a similar manner. Some
mouse units feature a scroller, which provides a better way of scrolling through
documents vertically and/or horizontally. The optomechanical mouse
eliminates the need for many of the wear-related repairs and maintenance
necessary with purely mechanical mice.

 Joystick- This performs the same function as the mouse. It is favored for computer
games. A joystick usually has a square or rectangular plastic base to which is
attached a vertical stem. Control buttons are located on the base and sometimes on
top of the stem. The stem can be moved in all directions to control the movement of
an object on the screen. The buttons activate various software features, generally
producing on-screen events. A joystick is usually a relative pointing device, moving
an object on the screen when the stem is moved from the centre and stopping the
movement when the stem is released. In industrial control applications, the joystick
can also be an absolute pointing device, with each position of the stem mapped to a
specific on-screen location.

 Speakers- Speakers are one of the most common output devices used with
computer systems. Some speakers are designed to work specifically with
computers, while others can be hooked up to any type of sound system. Regardless
of their design, the purpose of speakers is to produce audio output that can be heard
by the listener.

 Printers- These are computer peripherals that put text or a computer-generated


image on paper or on another medium, such as a transparency. Printers can be
categorized in several different ways. The most common distinction is between
impact and non-impact printers.
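As promised under the modem entry above, here is a toy Python sketch of modulation: bits mapped onto an analog-style waveform using two tones. The frequencies, sample rate, and bit duration are arbitrary illustration values, not any real modem standard:

    import math

    # Toy frequency-shift keying: 0 -> low tone, 1 -> high tone (values arbitrary).
    FREQ = {0: 1000.0, 1: 2000.0}    # tone frequency in Hz for each bit value
    SAMPLE_RATE = 8000               # samples per second
    SAMPLES_PER_BIT = 8              # how long each bit "sounds"

    def modulate(bits):
        """Return waveform samples representing the given bit stream."""
        samples = []
        for bit in bits:
            f = FREQ[bit]
            for n in range(SAMPLES_PER_BIT):
                t = n / SAMPLE_RATE
                samples.append(math.sin(2 * math.pi * f * t))
        return samples

    wave = modulate([1, 0, 1])
    print(len(wave), "samples")  # 24 samples encoding three bits

Demodulation runs the idea in reverse: the receiver measures which tone is present in each slice of the waveform and recovers the original bits.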

3. MEMORY

 Primary
Alternatively referred to as internal memory, main memory, main storage, and primary
memory, a primary storage device is a medium that holds memory for short periods of
time while a computer is running. Although it has a much lower access time and faster
performance, it is also about two orders of magnitude more costly than secondary
storage.

Random Access Memory (RAM) and cache are both examples of primary storage
devices. Primary storage's key differences from other types of storage are that it is
directly accessible by the CPU, it is volatile, and it is non-removable.
 Secondary
Secondary storage stores data and instructions when they are not used in processing.
It provides relatively long-term, non-volatile storage of data outside the CPU and primary
storage. Secondary storage is also known as external storage because it does not use
the computer memory to store data. External storage devices, which may actually be
located within the computer housing, are external to the main circuit board. These
devices store data as charges on a magnetically sensitive medium such as a magnetic
tape or, more commonly, on a disk coated with a fine layer of metallic particles. The
most popular secondary storage devices include the following.
o Magnetic disks - This broad category includes the following.
    Floppy disk - The floppy disk in normal use stores about 800 KB or about 1.4 MB.
    ZIP disk - A ZIP disk is much like a floppy disk but has a greater capacity.
    Hard disk - Hard, or "fixed", disks cannot be removed from their disk-drive
   cabinets, which contain the electronics to read and write data on to the magnetic
   disk surfaces. Hard disks currently used with personal computers can store from
   several hundred megabytes to several gigabytes.
    RAID (Redundant Array of Inexpensive Disks) - This is a disk storage technology
   to boost disk performance by packing more than 100 smaller disk drives with a
   control chip and specialized software in a single large unit to deliver data over
   multiple paths simultaneously. (A toy sketch of striping appears at the end of this section.)
o Optical disks - These disks use the same laser techniques that are used to
   create audio compact discs (CDs). Under this genre are:
    CD-ROM - This is an acronym for compact disc read-only memory, a form of
   storage characterized by high capacity (roughly 600 MB) and the use of laser
   optics rather than magnetic means for reading data.
    CD-R and CD-RW - In simple terms, these are blank CD-ROMs that are ready
   for data storage. A CD-R is similar to a WORM disc in that it cannot be erased or re-
   recorded. A CD-RW is capable of being erased and re-recorded.
    DVD - This is short for digital versatile disc. The group of DVD disc formats
   includes various forms of data recording for computer purposes, including discs
   that contain pre-recorded data (DVD-ROM) and discs that can be rewritten many
   times (DVD-RAM). These are several times the capacity of CD-ROMs. The
   simple single-layer version of the DVD holds between 3.7 and 4.38 GB (with
   double-layer versions holding 15.9 GB), compared to the 650 MB of CD-ROMs.
   These higher-capacity discs are used particularly for computer games and in
   multimedia applications.

    DVD-R and DVD-RW - These are blank optical disks in DVD format ready for
   data storage, just like CD-R and CD-RW.
Output devices enable the user to see the results of the computer's calculations or data
manipulations. They present data in a form the user of the computer can understand.
The most common output devices deliver either a soft copy or a hard copy of the data.
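As noted in the RAID entry above, the core idea is striping: splitting data across several drives so it can be delivered over multiple paths at once. A minimal Python sketch (drive count and block size are illustrative; real RAID levels also add parity or mirroring for redundancy):

    # Toy RAID-0-style striping: consecutive blocks go to drives round-robin.
    NUM_DRIVES = 4
    BLOCK_SIZE = 4  # bytes per block, illustrative

    def stripe(data):
        drives = [[] for _ in range(NUM_DRIVES)]
        blocks = [data[i:i + BLOCK_SIZE] for i in range(0, len(data), BLOCK_SIZE)]
        for i, block in enumerate(blocks):
            drives[i % NUM_DRIVES].append(block)   # spread blocks across drives
        return drives

    def unstripe(drives, total_blocks):
        return b"".join(drives[i % NUM_DRIVES][i // NUM_DRIVES]
                        for i in range(total_blocks))

    data = b"ABCDEFGHIJKLMNOP"
    drives = stripe(data)
    print(drives[0])                    # [b'ABCD'] -- first stripe on drive 0
    print(unstripe(drives, 4) == data)  # True: the four drives rebuild the data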

Reference:
Buenrostro, J. (2015). Library and Information Science Reviewer. Retrieved December
10, 2017 from https://www.scribd.com/document/317123232/Library-and-
Information-Science-Reviewer

V. COMPUTER SOFTWARE
(Raymon Aranda)

 Software
Computer software is a general term that describes computer programs. Software
takes the form of binary data; CD-ROMs, DVDs, and other types of media that are used to
distribute software can also be called software.
 History of Software

o 1822 - First mechanical computer
In 1822, Charles Babbage conceptualized and began developing the Difference Engine,
considered to be the first automatic computing machine.
o 1911
The Computing-Tabulating-Recording Company, the precursor to IBM, was founded on
June 16, 1911. At its beginning, it was a merger of three manufacturing businesses, a
product of the times orchestrated by the financier Charles Flint. From these humble
beginnings sprang the company that Thomas Watson Sr. would mold into a global force
in technology, management and culture.
o 1937 - Atanasoff-Berry Computer
Mathematician and physicist John Atanasoff, looking for ways to solve equations
automatically, took a drive to clear his thoughts in 1937. At a Mississippi River
roadhouse he jotted on a napkin the basic features of an electronic computing machine.
o 1970-1979 - Microsoft founded by Bill Gates and Paul Allen
Its first product was a BASIC programming language interpreter for the Altair 8800
microcomputer.
o Release of the Apple I computer
Built by Steve Wozniak and sold with Steve Jobs; the two founded Apple Computer.
o Oracle developed
Larry Ellison, the developer of Oracle, was originally contracted by the CIA to develop a
database program.
 First accounting software
 First spreadsheet software

o 1980-1989 - Microsoft Office released
There were other word processing software products available before the release of
Microsoft Office, but its popularity for business and personal use quickly left them in the dust.
o 1990-1999 - Linux released
Created by Linus Torvalds, it jump-started free and open-source software.
o QuickBooks launched
Its maker, Intuit, was founded in 1983; QuickBooks became the first and best-known
program of its kind for small business.
o Salesforce.com launched
Founded by a former Oracle employee; its product was the first business software
developed to run in the cloud.
o 2000-2009 - Apple iPhone launched
The iPhone made mobile computing easy and accessible.
o 2010 - Rise of tablet computing
In 2010, Apple, Samsung, and Dell all launched their versions of the tablet computer.
o Rise of wearable technology
General release of Google Glass in 2013 and announcement of the Apple Watch.
Computer software can now be found in everything from cars to refrigerators and other
household appliances, a reminder of how the technology has affected our daily lives.

A. OPERATING SYSTEM
(Jane Baylon)

An operating system (OS) is a program that acts as an interface between the user and
the computer hardware and controls the execution of all kinds of programs. OS is a
software program that enables the computer hardware to communicate and operate
with the computer software. Without a computer operating system, a computer and
software programs would be useless. An operating system is software which performs
all the basic tasks like file management, memory management, process management,
handling input and output, and controlling peripheral devices such as disk drives and
printers.
For hardware functions such as input and output and memory allocation, the operating
system acts as an intermediary between programs and the computer hardware,
although the application code is usually executed directly by the hardware and
frequently makes system calls to an OS function or is interrupted by it. Operating
systems are found on many devices that contain a computer – from cellular phones and
video game consoles to web servers and supercomputers.
 Important function of Operating Systems
 Memory Management - refers to the management of primary memory or main memory.
Main memory is a large array of words or bytes where each word or byte has its own
address. Main memory provides fast storage that can be accessed directly by the
CPU. For a program to be executed, it must be in main memory. An Operating
System does the following activities for memory management (a toy sketch of this
bookkeeping appears at the end of this section):
 Keeps track of primary memory, i.e., what parts of it are in use and by whom,
and what parts are not in use.
 In multiprogramming, the OS decides which process will get memory, when,
and how much.
 Allocates memory when a process requests it.
 De-allocates memory when a process no longer needs it or has been
terminated.
 Processor Management - the OS decides which process gets the processor when
and for how much time. This function is called process scheduling. An Operating
System does the following activities for processor management:
 Keeps track of the processor and the status of processes. The program responsible for
this task is known as the traffic controller.
 Allocates the processor (CPU) to a process.
 De-allocates the processor when a process is no longer required.
 Device Management - the Operating System manages device communication via the
devices' respective drivers. It does the following activities for device management:
 Keeps track of all devices. The program responsible for this task is known as the
I/O controller.
 Decides which process gets the device, when, and for how much time.
 Allocates devices in an efficient way.
 De-allocates devices.
 File Management - a file system is normally organized into directories for easy
navigation and usage. These directories may contain files and other directories. An
Operating System does the following activities for file management:
 Keeps track of information: its location, uses, status, etc. These collective facilities
are often known as the file system.
 Decides who gets the resources.
 Allocates the resources.
 De-allocates the resources.
Following are some of the important activities that an Operating System performs:
o Security − By means of password and similar other techniques, it prevents
unauthorized access to programs and data.
o Control over system performance − Recording delays between request for a
service and response from the system.
o Job accounting − Keeping track of time and resources used by various jobs and
users.
o Error detecting aids − Production of dumps, traces, error messages, and other
debugging and error detecting aids.
o Coordination between other software and users − Coordination and
assignment of compilers, interpreters, assemblers and other software to the
various users of the computer systems.
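As flagged under memory management above, here is a toy Python sketch of that bookkeeping: which parts of memory are in use, by whom, and reclaiming them on release. It is a first-fit toy model, not how any real OS allocator is implemented:

    # Toy first-fit memory manager: tracks which blocks are in use and by whom.
    class ToyMemoryManager:
        def __init__(self, size):
            self.free = [(0, size)]   # list of (start, length) holes
            self.used = {}            # process name -> (start, length)

        def allocate(self, proc, length):
            for i, (start, hole) in enumerate(self.free):
                if hole >= length:                        # first hole that fits
                    self.free[i] = (start + length, hole - length)
                    self.used[proc] = (start, length)
                    return start
            raise MemoryError("no hole large enough")

        def deallocate(self, proc):
            start, length = self.used.pop(proc)           # reclaim on termination
            self.free.append((start, length))

    mm = ToyMemoryManager(1024)
    print(mm.allocate("editor", 256))   # 0
    print(mm.allocate("shell", 128))    # 256
    mm.deallocate("editor")             # bytes 0-255 return to the free list

A real allocator would also merge adjacent free holes and guard against fragmentation; the sketch only shows the keep-track/allocate/de-allocate cycle described above.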

 Types of Operating Systems


1. Single- and multi-tasking
A single-tasking system can only run one program at a time, while a multi-tasking
operating system allows more than one program to be running concurrently. This is
achieved by time-sharing: dividing the available processor time between multiple
processes, each of which is interrupted repeatedly in time slices by a task-scheduling
subsystem of the operating system. (A toy round-robin sketch appears after this list of
types.)

2. Distributed
A distributed operating system manages a group of distinct computers and makes them
appear to be a single computer. The development of networked computers that could
be linked and communicate with each other gave rise to distributed computing.
3. Templated
In an OS, distributed and cloud computing context, templating refers to creating a single
virtual machine image as a guest operating system, then saving it as a tool for multiple
running virtual machines. The technique is used both in virtualization and cloud
computing management, and is common in large server warehouses.
4. Embedded
Embedded operating systems are designed to be used in embedded computer systems.
They are designed to operate on small machines like PDAs with less autonomy. They
are able to operate with a limited number of resources. They are very compact and
extremely efficient by design.
5. Real- time
A real-time operating system is an operating system that guarantees to process events
or data by a specific moment in time. A real-time operating system may be single- or
multi-tasking, but when multitasking, it uses specialized scheduling algorithms so that a
deterministic nature of behavior is achieved.
6. Library
A library operating system is one in which the services that a typical operating system
provides, such as networking, are provided in the form of libraries and composed with
the application and configuration code to construct a unikernel: a specialized, single
address space, machine image that can be deployed to cloud or embedded
environments.
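As mentioned under single- and multi-tasking above, time-sharing hands out processor time in slices. A toy round-robin Python sketch (the slice size and workloads are illustrative):

    from collections import deque

    # Toy round-robin time-sharing: each process runs one slice, then yields.
    def round_robin(workloads, time_slice=2):
        """workloads: dict of process name -> units of work remaining."""
        ready = deque(workloads.items())
        while ready:
            name, remaining = ready.popleft()
            run = min(time_slice, remaining)
            print(f"{name} runs {run} unit(s)")
            if remaining - run > 0:
                ready.append((name, remaining - run))  # interrupted, re-queued

    round_robin({"editor": 3, "spooler": 5, "clock": 1})

Each process is repeatedly interrupted at the end of its slice and pushed to the back of the ready queue, which is exactly the scheduling behavior described above.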

 Examples of Operating System


The three most popular types of operating systems for personal and business
computing include Linux, Windows and Mac.
o Windows - Microsoft Windows is a family of operating systems for personal and
business computers. Windows dominates the personal computer world, offering
a graphical user interface (GUI), virtual memory management, multitasking, and
support for many peripheral devices.
o MacOS - Mac OS is the official name of the Apple Macintosh operating system.
Mac OS features a graphical user interface (GUI) that utilizes windows and icons,
and all applications that run on a Macintosh computer have a similar user
interface.

o Linux - Linux is a freely distributed open source operating system that runs on a
number of hardware platforms. The Linux kernel was developed mainly by Linus
Torvalds and it is based on Unix.
There have been many operating systems that were significant in their day but are no
longer so, such as AmigaOS; OS/2 from IBM and Microsoft; classic Mac OS, the non-
Unix precursor to Apple's macOS; BeOS; XTS-300; RISC OS; MorphOS; Haiku;
BareMetal and FreeMint. Some are still used in niche markets and continue to be
developed as minority platforms for enthusiast communities and specialist applications.
The mobile OS is responsible for determining the functions and features available on
your device, such as thumb wheel, keyboards, WAP, synchronization with applications,
email, text messaging and more. The mobile OS will also determine which third-party
applications (mobile apps) can be used on your device. A mobile OS allows
smartphones, tablet PCs and other mobile devices to run applications and programs.
Mobile operating systems include Apple iOS, Google Android, BlackBerry OS and
Windows 10 Mobile.

B. PROGRAMMING LANGUAGE AND PROGRAMMING DEVELOPMENT TOOLS
(Elisse Jael Ramos and Ruby Atencio)

Computer Program – executable software that runs on a computer; a series of
instructions that directs a computer to perform tasks, created by a programmer
using a programming language.
Programmer – a person who develops software and writes computer programs;
an individual who composes instructions for computer systems to follow when performing
given actions.
Programming Language – the coding system used in writing programs.

 Classification used in Programming Languages


1. Machine Language – 1st generation of programming languages; the only language
that a computer understands; written in binary notation (0, 1).
2. Symbolic or Assembly Language – 2nd generation; makes use of symbolic
mnemonics (memory aids when writing a program); developed in the 1950s; allows
programmers to substitute names for numbers.
3. High Level Language – also called computer language, 3rd generation language
o Compiler – translates an entire program before executing it
o Interpreter – converts and executes one code statement at a time

 Example of High Level Languages:


o BASIC (Beginner’s All-purpose Symbolic Instruction Code) – an interactive
programming language created in the 1960s by John Kemeny and Thomas Kurtz.
o C – Developed by Dennis Ritchie at Bell Labs in the mid-1970s. C has proved
to be a powerful and flexible language that can be used for a variety of
applications, from business programs to engineering. C is a particularly popular
language among personal computer programmers because it is relatively small –
it requires less memory than other languages.
o C++ - Developed by Bjarne Stroustrup at Bell Labs, C++ adds object-oriented
features to its predecessor C. C++ is one of the most popular programming
languages for graphical applications, such as those that run in Windows and
Macintosh.
o COBOL (Common Business Oriented Language) – introduced in 1959 by
representatives from business and government for business applications.
o FORTRAN (Formula Translator) – developed in 1957 by IBM for statistical,
scientific and engineering calculations.
o PL/1 (Programming Language/1) – designed for both scientific and business
applications. Makes use of blocks of statements.

o Pascal – named after the mathematician Blaise Pascal. Devised by Niklaus
Wirth between 1968-1971. Facilitates the use of structured programming
techniques.
o APL (A Programming Language) – designed by Kenneth E. Iverson in the early
1960s. It permits users to specify complex algorithms and logical expressions.
It needs a special keyboard because the language is very symbolic.
o RPG (Report Program Generator) – symbolic language suited for creating
reports from input media.
o C# - based on C++ and was developed by Microsoft
o OOP (Object-Oriented Programming) – language that allows programmers the
ability to revise and modify existing objects
o Java – object-oriented programming language developed by Sun Microsystems
o Visual Studio – Microsoft’s suite of program development tools
o Visual Basic – based on the BASIC programming language
o Visual C++ - based on C++
o Visual C# - combines the programming elements of C++ with an easier,
rapid-development environment.

*RAD (Rapid Application Development) – programming system that enables
programmers to quickly build working programs. In general, RAD systems provide
a number of tools to help build graphical user interfaces that would normally take
a large development effort. Two of the most popular RAD systems for Windows are
Visual Basic and Delphi.
*Visual Programming Language – uses a visual or graphical interface for
creating all source code, typically in a RAD environment.
*Delphi – developed by Borland International, Inc. and based on Pascal; ideal
for building large-scale enterprise and Web applications in a RAD environment.

4. 4GL (Fourth-Generation Language) – a nonprocedural language that enables
users and programmers to access data in a database (e.g. SQL; a minimal sketch
appears at the end of this list)
o Macro – a series of statements that instructs an application how to complete a
task.
o HTML (Hyper-Text Markup Language) – the language that Web pages are
written in.
o XHTML (Extensible Hyper-Text Markup Language) – gives developers
more control over the appearance and organization of their Web pages; allows
websites to be displayed more easily on mobile devices.
o XML (Extensible Markup Language) – used to define documents with a
standard format. XML itself is not a markup language; instead, it is a
“metalanguage” that can be used to create markup languages for specific
applications. It allows Web developers to create customized tags and use
predefined tags to display content appropriately on various devices.
o WML (Wireless Markup Language) – an XML language used to specify content
and user interface for WAP devices; used to design pages for micro browsers.
o DHTML (Dynamic HTML) – allows Web developers to include more graphical
interest and interactivity.
o Ruby on Rails (RoR) – an open source Web application framework, written in
Ruby (object-oriented programming language), for developing database-backed
Web applications.
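As noted in the 4GL entry above, SQL is nonprocedural: a query states what data is wanted, not the steps for fetching it. A minimal sketch using Python's built-in sqlite3 module (the table and rows are hypothetical):

    import sqlite3

    # Hypothetical catalog table; the SQL describes the result, not the procedure.
    con = sqlite3.connect(":memory:")
    con.execute("CREATE TABLE books (title TEXT, year INTEGER)")
    con.executemany("INSERT INTO books VALUES (?, ?)",
                    [("Dune", 1965), ("Emma", 1815), ("Hamlet", 1603)])

    # Nonprocedural request: "titles published after 1800" -- no loops, no
    # index handling; the database engine decides how to retrieve the rows.
    for (title,) in con.execute(
            "SELECT title FROM books WHERE year > 1800 ORDER BY title"):
        print(title)    # Dune, Emma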

Reference:
Macasaet, K. M. F. V. (2015). 2015 UST College of Education LLE Review: Information
Technology

VI. COMPUTER DEVELOPMENT AND EFFECTS ON LIBRARY AND
INFORMATION WORKS

A. Computer Development in Library and Information Works


(Rina Angela R. Rodriguez)

 Computers in libraries
The two types of computers found in libraries are servers and personal computers. A
library’s integrated management system is stored on a server. It may be located outside
of the library in another building, for example, a specially designed and air-conditioned
computer room. In a consortium, in which the system is shared among multiple member
libraries, the server is stored in one location and accessed by member computers via
network.
Personal computers linked to the library system server via networks enable staff to carry
out daily library transactions. The computers run client versions of the system software
to search the catalog databases and access library records. This is known as a client-
server relationship. Library patrons use personal computers to search the library’s
online catalog and the internet, for word processing, and to send e-mail. One personal
computer can offer all of these activities, but many libraries separate them, limiting
some computers to catalog searching only, others to e-mail or internet research or
word processing. Network computers, which have little local memory, provide internet-
only access in some public areas, and occasionally in staff workspaces.
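To make the client-server relationship concrete, here is a toy Python sketch: a "server" answers a catalog lookup from a "client" over a local socket. The catalog contents and protocol are entirely hypothetical and far simpler than a real integrated library system:

    import socket
    import threading

    CATALOG = {"dune": "available", "emma": "on loan"}  # hypothetical records

    def serve(sock):
        conn, _ = sock.accept()            # server waits for a client request
        with conn:
            title = conn.recv(1024).decode().strip().lower()
            conn.sendall(CATALOG.get(title, "not in catalog").encode())

    server = socket.socket()
    server.bind(("127.0.0.1", 0))          # any free local port
    server.listen(1)
    threading.Thread(target=serve, args=(server,), daemon=True).start()

    client = socket.socket()               # the staff PC's client software
    client.connect(("127.0.0.1", server.getsockname()[1]))
    client.sendall(b"Dune")
    print(client.recv(1024).decode())      # available
    client.close()
    server.close()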
 Workstations
Personal computers sometimes are referred to as workstations. The term comes
from the idea of a scholar’s workstation, a PC with resources that a scholar or
staff member uses in a work situation. For example, a cataloger’s workstation
may have access to the Library of Congress Subject Headings in electronic
format, and MARC cataloging information, the library system cataloging module,
and the internet.
 Adaptive Technology
Adaptive or assistive technology describes both hardware and software for use
by people who have difficulties using standard computer setups. Some examples
of adaptive technology in libraries are:
o Keyboards with larger keys, those that use a wand or pen to touch the
keys, or are onscreen.
o Large mouse trackballs that are easier to hold and move around, e.g. for
children and elderly or disabled users.
o Voice output software that reads out data displaying on a screen.

o Voice recognition software that reads and inputs spoken data
o Closed-circuit TV devices that magnify text.
o Braille software and printers to create, edit, and output braille documents.

 Application of Information Technology in Library


The library is the main information center, and it can make use of the fast development
of Information Technology for the benefit of mankind as a whole. The librarian’s
preference in Information Technology should include all those technologies that are
expected to be used in library activities/operations and other library services for the
collection, processing, storage, retrieval and dissemination of recorded information.
The fast-developing information technologies have reached almost every area of
application, including libraries, where they are put to good use in the following
environments.
a) Library Management: Library management includes the following activities
which will certainly be geared up by the use of these fast Information Technology
developments: Classification, Cataloguing, Indexing, Database creation,
Database Indexing.
b) Library Automation: Library automation is the concept of reducing the human
intervention in all the library services so that any user can receive the desired
information with the maximum comfort and at the lowest cost. Major areas of
automation can be classified into two: organization of all library databases, and all
housekeeping operations of the library.
c) Library Networking: Library networking means a group of libraries and
information centers interconnected in some common pattern or design for
information exchange and communication, with a view to improving efficiency.
d) Audio-Video Technology: It includes photography, microfilms, microfiches,
audio and video tapes, printing, optical disks, etc.
e) Technical Communication: Technical Communication consisting of technical
writing, editing, publishing, Desktop publishing (DTP) systems etc.

References:
Wilson, Katie (2006). “Computers in libraries: an introduction for library technicians.”
Binghamton, New York: The Haworth Information Press.
Vijayakumar, A. and Vijayan, Sudhi S. (2011). “Application of information technology in
libraries: an overview.” International Journal of Digital Library Services. vol. 1,
October-December 2011, Issue: 2. Retrieved December 14, 2017 from
http://www.ijodls.in/uploads/3/6/0/3/3603729/vijaya12_144-152.pdf

B. Effects of ICT on Libraries
(Fatima May A. Jandoc)

The implementation of information technology in libraries has demanded new forms
of library services to achieve greater user satisfaction. The digital library service has
evolved from the implementation of IT in libraries and information centers. Information
technology has had a significant impact and has successfully changed the characteristics
of the information services being generated in libraries. The past two decades have seen
great changes in libraries due to information technology. The technological advancements
have made a significant impact on the growth of knowledge and the unlocking of human
potential. In libraries, the impact is clearly visible on information resources, services, and
people (Manjunatha, 2007).
One of the distinct gifts of information technology has been the invention of devices with
huge storage capacity. CD-ROMs, DVDs and flash memory cards have changed the
face of libraries. Online access to information has turned many libraries into “virtual
libraries” (Mishra, 2001). Libraries are now changing the way in which information is
stored and disseminated to users.
The next benefit of IT is the automation of library activities. Many in-house operations in
the library, like acquisition, processing, circulation, maintenance, and serials management,
have changed from manual to automated. The need for automation arises from the
desire to reduce the effort and time required for these jobs. Much software is now
available in the market for library automation. IT has helped in establishing library
networking and resource sharing through the internet and intranets. Library networks
have expanded the scope of resource sharing and information exchange. Today the
internet is a major resource for librarians. The application of IT has contributed to the
provision of quick, quality services in libraries.
Another impact is remote access to a variety of commercial and non-commercial
information sources, i.e. online full-text databases, e-journals, e-books, the library
catalogue (OPAC), etc. Present-day information seekers can access worldwide
information through the internet on their desktops without any time limitation.
IT has had a wide-ranging impact on library and information work. Information activities
have undergone rapid transformations from conventional methods, consequent upon
the introduction of new technologies. This is summarized in the following table.

Information Activity        Conventional Method              New Technology

1. Generate, Originate      Writing, typing                  Word processing, text editing,
                                                             character recognition, voice
                                                             recognition

2. Preserve, Store          Manuscript, paper-print          Electronic publishing, magnetic
                            media                            storage, videotext, teletext,
                                                             computer disk, ROM

3. Process                  Classification, cataloguing,     Electronic data processing,
                            indexing                         artificial intelligence/expert
                                                             systems

4. Retrieval                Catalogues, indexes              Database management systems,
                                                             off-line and on-line information
                                                             retrieval

5. Disseminate/             Lists, bibliographies,           Electronic mail, electronic
   Communicate              abstracts, hard copies           document delivery, computer
                                                             conferencing, telefacsimile,
                                                             viewdata

6. Destroy                  Physical weeding                 Magnetic erasers, optical
                                                             erasers, re-use of the medium

REFERENCES
Buenrostro, J. (2015). Library and Information Science Reviewer. Retrieved December
10, 2017 from https://www.scribd.com/document/317123232/Library-and-
Information-Science-Reviewer
Macasaet, K. M. F. V. (2015). 2015 UST College of Education LLE Review: Information
Technology
Wilson, Katie (2006). “Computers in libraries: an introduction for library technicians.”
Binghamton, New York: The Haworth Information Press.
Vijayakumar, A. and Vijayan, Sudhi S. (2011). “Application of information technology in
libraries: an overview.” International Journal of Digital Library Services. vol. 1,
October-December 2011, Issue: 2. Retrieved December 14, 2017 from
http://www.ijodls.in/uploads/3/6/0/3/3603729/vijaya12_144-152.pdf

