INTRODUCTION
(April Dannee F. Niñalga)
An information system (IS) is a set of people, procedures, and resources that collects,
transforms, and disseminates information in an organization. It is a system that accepts
data resources as input and processes them into information products as output. An
information system can be an organized combination of:
• hardware (physical equipment, machines, and media; may be mechanical, electronic,
electrical, magnetic, or optical devices)
• software (computer programs and procedures concerned with the operation of
the information system)
• data/information
o Data - streams of raw facts
o Information - processed data
• people (information specialists, librarians, knowledge workers, IT people, etc.)
• communication networks (LAN, client/server networks, internet, intranet, etc.)
A computer-based information system (CBIS) relies on computer hardware and
software for processing and disseminating information. The librarian or information
specialist provides and delivers information system services, which nowadays are
usually computer-based.
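The input-process-output cycle described above can be sketched in a few lines of Python. The loan records, titles, and field names here are invented purely for illustration; the point is only the transformation of raw facts (data) into processed data (information).

```python
# Sketch of an information system's basic cycle:
# raw circulation data (input) -> processing -> summary information (output).

raw_loans = [                      # data: streams of raw facts
    {"title": "Dune", "days_borrowed": 14},
    {"title": "Dune", "days_borrowed": 7},
    {"title": "Emma", "days_borrowed": 21},
]

def process(loans):
    """Transform raw loan records into information: loans per title."""
    summary = {}
    for loan in loans:
        summary[loan["title"]] = summary.get(loan["title"], 0) + 1
    return summary

information = process(raw_loans)   # information: processed data
print(information)                 # {'Dune': 2, 'Emma': 1}
```

The same pattern underlies any information system, whatever the scale: data resources go in, a procedure transforms them, and an information product comes out.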
Reference:
Buenrostro, J. (2015). Library and Information Science Reviewer. Retrieved December 10, 2017 from https://www.scribd.com/document/317123232/Library-and-Information-Science-Reviewer
II. HISTORY OF COMPUTING
(April Dannee F. Niñalga)
In 1924, after further acquisitions, the Computing-Tabulating-Recording Company was
renamed the International Business Machines Corporation (IBM). Thomas J. Watson, Sr.,
who had joined the company a decade earlier, built the foundering firm into an industrial
giant. IBM soon became the country's largest manufacturer of time clocks and developed
and marketed the first electric typewriter. In the early 1950s the company entered the
computer field. Punched card technology was widely used until the mid-1950s.
Also in the 19th century, the British mathematician and inventor Charles Babbage
(referred to as the Father of the modern computer) worked out the principles of the
modern digital computer. He conceived a number of machines, such as the Difference
Engine and the Analytical Engine, the forerunners of the modern computer, that were
designed to handle complicated mathematical problems. One of Babbage's designs, the
Analytical Engine, had many features of a modern computer. It had an input stream in
the form of a deck of punched cards, a "store" for saving data, a "mill" for arithmetic
operations, and a printer that made a permanent record. Babbage failed to put this idea
into practice, though it may well have been technically possible at that date.
Many historians consider Babbage and his associate, the mathematician Augusta Ada
Byron, Countess of Lovelace and daughter of the poet, Lord Byron, the true pioneers of
the modern digital computer. The latter provided complete details as to exactly how the
analytical engine was to work. Because she described some of the key elements in
computer programming, she was referred to as the "world's first computer programmer".
Early Computers
Analogue computers began to be built in the late 19th century. Early models calculated
by means of rotating shafts and gears. Numerical approximations of equations too
difficult to solve in any other way were evaluated with such machines. Lord Kelvin built a
mechanical tide predictor that was a specialized analogue computer. During World
Wars I and II, mechanical and, later, electrical analogue computing systems were used
as torpedo course predictors in submarines and as bombsight controllers in aircraft.
Another system was designed to predict spring floods in the Mississippi River basin.
In the United States, a prototype electronic machine had been built as early as 1939, by
John Atanasoff and Clifford Berry, at Iowa State College. This prototype and later
research, completed quietly, led to the development of the Atanasoff-Berry Computer
(ABC), considered the first electronic computing machine. It could only perform addition
and subtraction, and it never became operational because of the inventors'
involvement in US military efforts in World War II.
In 1944, Howard Aiken completed the MARK I computer (also known as the Automatic
Sequence Controlled Calculator), the first electromechanical computer. It could solve
mathematical problems 1,000 times faster than existing machines.
The first electronic computer to be made operational was the Electronic Numerical
Integrator and Calculator (ENIAC). It was built in 1946 for the US Army to perform
quickly and accurately the complex calculations that gunners needed to aim their
artillery weapons. ENIAC contained 18,000 vacuum tubes and had a speed of several
hundred multiplications per minute, but originally its program was wired into the
processor and had to be manually altered.
Scientists at Cambridge University in England designed the world's first
electronic computer that stored its program of instructions, the Electronic Delay Storage
Automatic Calculator (EDSAC). This gave more flexibility in the use of the computer.
Two years after (1951), machines were built with program storage, based on the ideas
of the Hungarian-American mathematician John von Neumann, then associated with the
University of Pennsylvania. The instructions, like the data, were stored within a
"memory", freeing the computer from the speed limitations of the paper-tape reader
during execution and permitting problems to be solved without rewiring the computer.
This concept gave birth to the Electronic Discrete Variable Automatic Computer (EDVAC).
During World War II a team of scientists and mathematicians, working at Bletchley Park,
north of London, created one of the first all-electronic digital computers: Colossus. By
December 1943, Colossus, which incorporated 1,500 vacuum tubes, was operational. It
was used in the largely successful effort to crack enciphered high-level German radio
messages, part of the codebreaking operation with which Alan Turing is closely associated.
Reference:
Buenrostro, J. (2015). Library and Information Science Reviewer. Retrieved December 10, 2017 from https://www.scribd.com/document/317123232/Library-and-Information-Science-Reviewer
III. GENERATION OF COMPUTERS
(Darwin Gonzalodo)
was an instant hit and there were tremendous demands from business and scientific
organizations.
Third Generation of Computers
Even though the first IC was invented earlier, during the era of first generation
computers, it was only in the late 1960s that it was introduced commercially, making it
possible for many transistors to be fabricated on one silicon substrate, with
interconnecting wires plated in place. The IC resulted in a further reduction in price,
size, and failure rate. This was the start of third generation computers (mid-1960s to
mid-1970s).
Some historians consider the IBM System/360 family of computers the single most
important innovation in the history of computers. It was conceived as a family of
computers with upward compatibility: when a company outgrew one model, it could
move up to the next model without worrying about converting its data. This made all
previous computers obsolete.
In 1964, Beginner's All-purpose Symbolic Instruction Code (BASIC), a high-level
programming language, was developed by John Kemeny and Thomas Kurtz at
Dartmouth College. BASIC gained its enormous popularity mostly because it could be
learned and used quickly. The language has changed over the years, from a teaching
language into a versatile and powerful language for both business and scientific
applications.
In 1969, two Bell Telephone Labs software engineers, Dennis Ritchie and Ken
Thompson, who had worked on the multi-user computer system Multics (Multiplexed
Information and Computing Service), implemented a rudimentary operating system they
named Unics, as a pun on Multics. Somehow, the name became UNIX. The most
notable feature of this operating system is its portability: it can run on many types of
computers, is machine-independent, and supports multi-user processing, multitasking,
and networking. UNIX is used in high-end workstations and servers. It is written in the
C language, which was developed by Ritchie.
Fourth Generation of Computers
The introduction of large-scale integration of circuitry (more circuits per unit of space)
marks the beginning of the fourth generation of computers. The base technology is still
the IC, though it saw significant innovations over the following two decades. The
computer industry experienced a mind-boggling succession of advancements in the
further miniaturization of circuitry, data communications, and the design of computer
hardware and software. The microprocessor became a reality in the mid-1970s with the
introduction of the large-scale integrated (LSI) circuit.
Bill Gates and Paul Allen revolutionized the computer industry. They developed the
BASIC programming language for the first commercially-available microcomputer, the
MITS Altair. After successful completion of the project, the two formed Microsoft
Corporation in 1975. Microsoft is now the largest and most influential software company
in the world. Microsoft was given an enormous boost when its operating system
software, MS-DOS, was selected for use by the IBM PC. Gates, who became the
wealthiest person in the world, provides the company's vision of new product ideas and
technologies.
One important entrepreneurial venture during these early years was the Apple II
personal computer, introduced in 1977. This event forever changed how society
perceives computers: computing was now available to individuals and very small
companies.
IBM tossed its hat into the personal computer ring with its release of the IBM personal
computer in 1981. By the end of 1982, 835,000 units had been sold. When software
vendors began to orient their products to the IBM PC, many companies began offering
IBM PC-compatibles or clones. Today, the IBM PC and its clones have become a
powerful standard in the microcomputer industry.
In 1982, Mitchell Kapor founded the Lotus Development Corporation (acquired by IBM
only much later, in 1995). It introduced an electronic spreadsheet product, Lotus 1-2-3,
which gave the IBM PC credibility in the business marketplace. Sales of the IBM PC
and Lotus 1-2-3 soared.
In 1984, Apple introduced the Macintosh desktop computer with a very friendly
graphical user interface (GUI), proof that computers can be easy and fun to use. The
GUI began to change the complexion of the software industry, shifting the interaction
between user and computer from a short, character-oriented exchange modeled on the
teletypewriter to the now famous WIMP interface (WIMP stands for windows, icons,
menus, and pointing devices).
It was in 1985 when Microsoft adopted the GUI in its Windows operating system for IBM
PC-compatible computers. Windows did not enjoy widespread acceptance until 1990,
with the release of Windows 3.0. It gave a huge boost to the software industry because
larger, more complex programs could now be run on IBM PC-compatibles. Subsequent
releases made the PC even easier to use, fueling the PC explosion in the 1990s.
In 1991, Linus Torvalds developed Linux, a reliable and compactly designed operating
system that is an offshoot of UNIX and can run on many different hardware platforms. It
is available free or at very low cost, and is used as an alternative to the costly Windows
operating system. IBM PC-compatible PCs had used Intel microprocessor chips from
the start, followed by a succession of even more powerful chips. But not until the Intel
Pentium, introduced in 1993, and its successors did PCs do much with multimedia (the
integration of motion, video, animation, graphics, sound, and so on). The emergence of
the high-powered Intel Pentium processors and their ability to handle multimedia
applications changed the way people view and use PCs.
It was also in 1993 that millions of people began to tune into the Internet for news.
The World Wide Web (WWW), one of several internet-based applications, came of age
as Web traffic grew 341,634%. The Web is unique in that it enables pages to be
linked across the Internet. A number of Internet browsers were introduced (e.g. Mosaic
and Netscape Navigator, developed by Marc Andreessen, and Internet Explorer by
Microsoft Corporation). These browsers enabled users to navigate the World Wide Web
with ease. Today, the WWW is the foundation for most Internet communications and
services. The World Wide Web itself was created in 1991 by Tim Berners-Lee, an
engineer in Geneva, Switzerland.
Fifth Generation of Computers
The fifth generation of computers is characterized by the very large-scale integrated
(VLSI) circuit (microchip), with many thousands of interconnected transistors etched into
a single silicon substrate. It is also characterized by network computers of all sizes, the
Internet, Intranets, and Extranets.
The year 1996 marked the 50th year of computer history. The US Postal Service issued
stamps commemorating the 50th anniversary of ENIAC, the first full-scale computer,
and the 50 years of computer technology that followed. It was during this year that the
handheld computer was introduced, signaling to the world that tremendous computing
power can be placed in the palm of your hand. Nowadays, millions of people rely on
handhelds for a variety of personal information management applications, including
e-mail.
In the year 1999, the world was threatened by the Y2K problem, also known as the
millennium bug. It may have been one of the biggest challenges ever to confront the
businesses of the world. For most of the 20th century, information systems had used
only two digits to represent the year (e.g. 99 for 1999). When the 20th century ended,
non-compliant computers would interpret the date 01-01-00, meant as January 1, 2000,
as January 1, 1900. Y2K heightened management's awareness of how critical
information technology is to the operation of any organization.
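The two-digit-year ambiguity can be demonstrated in a few lines of Python. This is only a sketch of the logic; the affected legacy systems were mostly written in COBOL and similar languages, and the sample date is illustrative.

```python
from datetime import datetime

# A date stored the non-compliant way: two digits for the year.
record_date = "01-01-00"

# Legacy interpretation: many old systems simply prepended "19" to the
# two-digit year, turning the year 2000 into 1900.
legacy_year = 1900 + int(record_date.split("-")[2])

# Python's %y uses a pivot instead: values 00-68 map to 2000-2068.
modern = datetime.strptime(record_date, "%m-%d-%y")

print(legacy_year)   # 1900 -- the Y2K misreading
print(modern.year)   # 2000 -- what was actually meant
```

The bug was thus not a crash but a silent misinterpretation: every calculation built on the date (interest, ages, schedules) would be off by a century.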
Jack Kilby's first IC contained a single transistor. Tens of thousands of engineers around
the world have built on his invention, such that each year our society is the beneficiary
of smaller, more powerful, cheaper chips. One continuing trend in computer
development is microminiaturization, the effort to compress more circuit elements into
smaller and smaller chip space. In 1999, scientists developed a circuit the size of a
single layer of molecules, and in 2000 IBM announced that it had developed new
technology to produce computer chips that operate five times faster than the most
advanced models to date. Also in 2000, scientists discovered a way to transfer
information on an atomic level without relying on traditional wires or circuits. This effect,
dubbed the quantum mirage, describes how an atom of matter placed in an elliptical-
shaped structure on a solid surface reflects itself at other points within the ellipse,
thereby relaying information. Researchers are also trying to speed up circuitry functions
through the use of superconductivity, the phenomenon of decreased electrical
resistance observed in certain materials at very low temperatures.
Whether we are moving into a fifth generation of computing is a subject of debate since
the concept of generations may no longer fit the continual, rapid changes occurring in
computer hardware, software, data, and networking technologies. But in any case, we
can be sure that progress in computing will continue to accelerate and that the
development of Internet-based technologies and applications will be one of the major
forces driving computing in the 21st century.
Reference:
Buenrostro, J. (2015). Library and Information Science Reviewer. Retrieved December 10, 2017 from https://www.scribd.com/document/317123232/Library-and-Information-Science-Reviewer
IV. COMPUTER HARDWARE
(Maria Dominique R. Miranda)
Hardware
Hardware refers to the physical elements of a computer, sometimes called the
machinery or equipment of the computer. Examples of hardware in a computer are the
keyboard, the monitor, the mouse, and the central processing unit. However, much of a
computer's hardware cannot be seen; it is not an external element of the computer but
an internal one, surrounded by the computer's casing (tower). A computer's hardware
comprises many different parts, but perhaps the most important of these is the
motherboard. The motherboard is made up of even more parts that power and control
the computer.
In contrast to software, hardware is a physical entity. Hardware and software are
interconnected: without software, the hardware of a computer would have no function.
Conversely, without hardware to perform the tasks directed by software via the central
processing unit, software would be useless.
Hardware is limited to specifically designed tasks that are, taken independently, very
simple. Software implements algorithms (problem solutions) that allow the computer to
complete much more complex tasks.
The Basic Structure of A Computer System Consists of Three Parts
1. CPU
Alternately referred to as a processor, central processor, or microprocessor, the CPU
(pronounced sea-pea-you) is the Central Processing Unit of the computer. A computer's
CPU handles all instructions it receives from hardware and software running on the
computer.
2. INPUT – OUTPUT DEVICES
Keyboard- A computer keyboard is one of the primary input devices used with a
computer that looks similar to those found on electric typewriters, but with some
additional keys. Keyboards allow you to input letters, numbers, and other symbols
into a computer that can serve as commands or be used to type text.
Modem- A modem converts an incoming analog signal to a digital signal that a
computer can recognize. Similarly, it converts outgoing digital data from a computer
or other device to an analog signal.
Mouse- This was invented by Douglas Engelbart and was popularized by its
inclusion as standard equipment with the Apple Macintosh. It helps a user navigate
through a graphical computer interface. It is generally mapped so that an on-screen
cursor may be controlled by moving the mouse across a flat surface. There are
many variations on mouse design, but they all work in a similar manner. Some
mouse units feature a scroller, which provides a better way of scrolling through
documents vertically and/or horizontally. The later optomechanical mouse
eliminates the need for many of the wear-related repairs and maintenance
necessary with purely mechanical mice.
Joystick- This performs the same function as the mouse. It is favored for computer
games. A joystick usually has a square or rectangular plastic base to which is
attached a vertical stem. Control buttons are located on the base and sometimes on
top of the stem. The stem can be moved in all directions to control the movement of
an object on the screen. The buttons activate various software features, generally
producing on-screen events. A joystick is usually a relative pointing device, moving
an object on the screen when the stem is moved from the centre and stopping the
movement when the stem is released. In industrial control applications, the joystick
can also be an absolute pointing device, with each position of the stem mapped to a
specific on-screen location.
Speakers- Speakers are one of the most common output devices used with
computer systems. Some speakers are designed to work specifically with
computers, while others can be hooked up to any type of sound system. Regardless
of their design, the purpose of speakers is to produce audio output that can be heard
by the listener.
3. MEMORY
Primary
Alternatively referred to as internal memory, main memory, main storage, and primary
memory, a primary storage device is a medium that holds memory for short periods of
time while a computer is running. Although it has a much lower access time and faster
performance, it is also about two orders of magnitude more costly than secondary
storage.
Random Access Memory (RAM) and cache are both examples of a primary storage
device. The image shows three different types of storage for computer data. Primary
storage's key differences from the others are that it is directly accessible by the CPU, it
is volatile, and it is non-removable.
Secondary
Secondary storage stores data and instructions when they are not being used in
processing. It provides relatively long-term, non-volatile storage of data outside the
CPU and primary storage. Secondary storage is also known as external storage
because it does not use the computer's main memory to store data. External
storage devices, which may actually be
located within the computer housing, are external to the main circuit board. These
devices store data as charges on a magnetically sensitive medium such as a magnetic
tape or, more commonly, on a disk coated with a fine layer of metallic particles. The
most popular secondary storage devices include the following.
o Magnetic disks - This broad category includes the following.
o Floppy disk - The floppy disk in normal use stores about 800 KB or about 1.4 MB.
o ZIP disk - A ZIP disk is much like a floppy disk but has a greater capacity.
o Hard disk - Hard, or "fixed", disks cannot be removed from their disk-drive
cabinets, which contain the electronics to read and write data on to the magnetic
disk surfaces. Hard disks currently used with personal computers can store from
several hundred megabytes to several gigabytes.
o RAID (Redundant Array of Inexpensive Disks) - This is a disk storage technology
to boost disk performance by packing more than 100 smaller disk drives with a
control chip and specialized software in a single large unit to deliver data over
multiple paths simultaneously.
o Optical disks - These disks use the same laser techniques that are used to
create audio compact discs (CDs). Under this genre are:
o CD-ROM - This is an acronym for compact disc read-only memory, a form of
storage characterized by high capacity (roughly 600 MB) and the use of laser
optics rather than magnetic means for reading data.
o CD-R and CD-RW - In simple definition, these are blank CD-ROM that are ready
for data storage. A CD-R is similar to a WORM which cannot be erased or re-
recorded. A CD-RW is capable of being erased and re-recorded.
o DVD - This is short for digital versatile disc. The group of DVD disc formats
includes various forms of data recording for computer purposes, including discs
that contain pre-recorded data (DVD-ROM) and discs that can be rewritten many
times (DVD-RAM). These are several times the capacity of CD-ROMs. The
simple single-layer version of the DVD holds between 3.7 and 4.38 GB (with
double-layer versions holding 15.9 GB), compared to the 650 MB of CD-ROMs.
These higher capacity discs are used particularly for computer games and in
multimedia applications.
o DVD-R and DVD-RW - These are blank optical disks in DVD format ready for
data storage, just like CD-R and CD-RW.
Output Devices
Output devices enable the user to see the results of the computer's calculations or
data manipulations. They present data in a form the user of the computer can
understand. The most common output devices deliver either a soft copy or a hard
copy of the data. Devices that render soft copy are the following:
Reference:
Buenrostro, J. (2015). Library and Information Science Reviewer. Retrieved December 10, 2017 from https://www.scribd.com/document/317123232/Library-and-Information-Science-Reviewer
V. COMPUTER SOFTWARE
(Raymon Aranda)
Software
Computer software is a general term that describes computer programs. Software
consists of binary data; the CD-ROMs, DVDs, and other types of media that are used to
distribute software can also be called software.
History of Software
o 1980-1989 - Microsoft Office released
There were other word processing software products available before the release of
Microsoft Office, but its popularity for business and personal use quickly left them in
the dust.
o 1990-1999 - Linux released
Created by Linus Torvalds, it jump-started free and open-source software.
o QuickBooks launched
From Intuit, founded in 1983, it became the first well-known accounting solution for
small businesses.
o Salesforce.com launched
Founded by a former Oracle employee, it was the first business software developed
to run in the cloud.
o 2000-2009 - Apple iPhone launched
The iPhone made mobile computing easy and accessible.
o 2010 - Rise of tablet computing
In 2010, Apple, Samsung, and Dell all launched their versions of the tablet computer.
o Rise of wearable technology
General release of Google Glass in 2013 and announcement of the Apple Watch.
Computer software can be found in everything from cars to refrigerators and other
household appliances, showing how the technology has impacted our daily lives.
A. OPERATING SYSTEM
(Jane Baylon)
An operating system (OS) is a program that acts as an interface between the user and
the computer hardware and controls the execution of all kinds of programs. OS is a
software program that enables the computer hardware to communicate and operate
with the computer software. Without a computer operating system, a computer and
software programs would be useless. An operating system is software which performs
all the basic tasks like file management, memory management, process management,
handling input and output, and controlling peripheral devices such as disk drives and
printers.
For hardware functions such as input and output and memory allocation, the operating
system acts as an intermediary between programs and the computer hardware,
although the application code is usually executed directly by the hardware and
frequently makes system calls to an OS function or is interrupted by it. Operating
systems are found on many devices that contain a computer – from cellular phones and
video game consoles to web servers and supercomputers.
Important Functions of Operating Systems
Memory Management - refers to the management of primary memory or main memory.
Main memory is a large array of words or bytes where each word or byte has its own
address. Main memory provides fast storage that can be accessed directly by the
CPU. For a program to be executed, it must be in main memory. An Operating
System does the following activities for memory management:
Keeps track of primary memory, i.e., what parts of it are in use and by whom,
and what parts are not in use.
In multiprogramming, the OS decides which process will get memory, when,
and how much.
Allocates memory when a process requests it.
De-allocates memory when a process no longer needs it or has been
terminated.
Processor Management - the OS decides which process gets the processor, when,
and for how much time. This function is called process scheduling. An Operating
System does the following activities for processor management:
Keeps track of the processor and the status of processes. The program
responsible for this task is known as the traffic controller.
Allocates the processor (CPU) to a process.
De-allocates the processor when a process no longer requires it.
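One common scheduling policy that decides "which process gets the processor, when, and for how much time" is round-robin. The sketch below is illustrative (process names and burst times are invented), but it is a faithful miniature of the policy: each process receives the CPU for a fixed time quantum, then goes to the back of the ready queue if unfinished.

```python
# Round-robin process scheduling sketch: each process runs for one
# time quantum in turn until its required CPU time is used up.

from collections import deque

def round_robin(bursts, quantum):
    """bursts: {pid: CPU time needed}; returns the order the CPU is granted."""
    ready = deque(bursts.items())          # FIFO ready queue
    order = []
    while ready:
        pid, left = ready.popleft()
        order.append(pid)                  # allocate the processor to pid
        left -= quantum                    # pid runs for one quantum
        if left > 0:
            ready.append((pid, left))      # unfinished: back of the queue
    return order                           # de-allocated when finished

print(round_robin({"A": 5, "B": 3, "C": 1}, quantum=2))
# ['A', 'B', 'C', 'A', 'B', 'A']
```

Short jobs (like C here) finish quickly while long jobs still make steady progress, which is why round-robin is a staple of time-sharing systems.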
Device Management - the Operating System manages device communication via the
devices' respective drivers. It does the following activities for device management:
Keeps track of all devices. The program responsible for this task is known as
the I/O controller.
Decides which process gets the device, when, and for how much time.
Allocates devices in an efficient way.
De-allocates devices.
File Management - a file system is normally organized into directories for easy
navigation and usage. These directories may contain files and other directories. An
Operating System does the following activities for file management:
Keeps track of information, location, uses, status, etc. These collective
facilities are often known as the file system.
Decides who gets the resources.
Allocates the resources.
De-allocates the resources.
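The file-management record-keeping above can be sketched as a tiny file table in Python. The class, field names, and the sample file are all invented for illustration; real file systems store this metadata in on-disk structures such as inodes or file allocation tables.

```python
# Toy file table: the OS tracks each file's location, allocated storage,
# and status, and releases those resources when the file is deleted.

class FileTable:
    def __init__(self):
        self.table = {}                         # filename -> metadata

    def create(self, name, blocks, directory="/"):
        self.table[name] = {"directory": directory,
                            "blocks": blocks,   # allocated storage blocks
                            "status": "closed"}

    def open(self, name):
        self.table[name]["status"] = "open"

    def delete(self, name):
        return self.table.pop(name)             # de-allocate its resources

fs = FileTable()
fs.create("catalog.db", blocks=[4, 5, 6])
fs.open("catalog.db")
print(fs.table["catalog.db"]["status"])         # open
fs.delete("catalog.db")
print("catalog.db" in fs.table)                 # False
```

Everything the list above names (location, uses, status) lives in the metadata dictionary; allocation and de-allocation are inserts and removals in that table.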
Following are some of the important activities that an Operating System performs:
o Security − By means of passwords and similar techniques, it prevents
unauthorized access to programs and data.
o Control over system performance − Recording delays between a request for a
service and the response from the system.
o Job accounting − Keeping track of time and resources used by various jobs and
users.
o Error detecting aids − Production of dumps, traces, error messages, and other
debugging and error detecting aids.
o Coordination between other software and users − Coordination and
assignment of compilers, interpreters, assemblers, and other software to the
various users of the computer systems.
2. Distributed
A distributed operating system manages a group of distinct computers and makes them
appear to be a single computer. The development of networked computers that could
be linked and communicate with each other gave rise to distributed computing.
3. Templated
In an OS, distributed and cloud computing context, templating refers to creating a single
virtual machine image as a guest operating system, then saving it as a tool for multiple
running virtual machines. The technique is used both in virtualization and cloud
computing management, and is common in large server warehouses.
4. Embedded
Embedded operating systems are designed to be used in embedded computer systems.
They are designed to operate on small machines like PDAs with less autonomy. They
are able to operate with a limited number of resources. They are very compact and
extremely efficient by design.
5. Real-time
A real-time operating system is an operating system that guarantees to process events
or data by a specific moment in time. A real-time operating system may be single- or
multi-tasking, but when multitasking, it uses specialized scheduling algorithms so that a
deterministic nature of behavior is achieved.
6. Library
A library operating system is one in which the services that a typical operating system
provides, such as networking, are provided in the form of libraries and composed with
the application and configuration code to construct a unikernel: a specialized, single
address space, machine image that can be deployed to cloud or embedded
environments.
o Linux - Linux is a freely distributed open source operating system that runs on a
number of hardware platforms. The Linux kernel was developed mainly by Linus
Torvalds and it is based on Unix.
There have been many operating systems that were significant in their day but are no
longer so, such as AmigaOS; OS/2 from IBM and Microsoft; classic Mac OS, the
non-Unix precursor to Apple's macOS; BeOS; XTS-300; RISC OS; MorphOS; Haiku;
BareMetal and FreeMint. Some are still used in niche markets and continue to be
developed as minority platforms for enthusiast communities and specialist applications.
The mobile OS is responsible for determining the functions and features available on
your device, such as thumb wheels, keyboards, WAP, synchronization with applications,
email, text messaging, and more. The mobile OS also determines which third-party
applications (mobile apps) can be used on your device. A mobile OS allows
smartphones, tablet PCs, and other mobile devices to run applications and programs.
Mobile operating systems include Apple iOS, Google Android, BlackBerry OS, and
Windows 10 Mobile.
B. PROGRAMMING LANGUAGE AND PROGRAMMING DEVELOPMENT TOOLS
(Elisse Jael Ramos and Ruby Atencio)
o Pascal – named after the mathematician Blaise Pascal. Devised by Niklaus
Wirth between 1968 and 1971, it facilitates the use of structured programming
techniques.
o APL (A Programming Language) – designed by Kenneth E. Iverson in the early
1960s. It permits users to specify complex algorithms and logical expressions.
Because the language is highly symbolic, it needs a special keyboard.
o RPG (Report Program Generator) – symbolic language suited for creating
reports from input media.
o C# - based on C++ and developed by Microsoft
o OOP (Object-Oriented Programming) – an approach that gives programmers the
ability to revise and modify existing objects
o Java – object-oriented programming language developed by Sun Microsystems
o Visual Studio – Microsoft’s suite of program development tools
o Visual Basic – based on the BASIC programming language
o Visual C++ - based on C++
o Visual C# - combines the programming elements of C++ with an easier, rapid-
development environment.
o XML (Extensible Markup Language) – allows Web developers to create customized
tags and use predefined tags to display content appropriately on various devices.
o WML (Wireless Markup Language) – an XML-based language used to specify content
and user interface for WAP devices; used to design pages for microbrowsers.
o DHTML (Dynamic HTML) – allows Web developers to include more graphical
interest and interactivity
o Ruby on Rails (RoR) – an open source Web application framework, written in
Ruby (object-oriented programming language), for developing database-backed
Web applications.
Reference:
Macasaet, K. M. F. V. (2015). 2015 UST College of Education LLE Review: Information
Technology
VI. COMPUTER DEVELOPMENT AND EFFECTS ON LIBRARY AND
INFORMATION WORKS
Computers in libraries
The two types of computers found in libraries are servers and personal computers. A
library's integrated management system is stored on a server, which may be located
outside the library in another building, for example in a specially designed,
air-conditioned computer room. In a consortium, in which the system is shared among
multiple member libraries, the server is stored in one location and accessed by
member computers via a network.
Personal computers linked to the library system server via networks enable staff to
carry out daily library transactions. The computers run client versions of the system
software to search the catalog databases and access library records. This is known as
a client-server relationship. Library patrons use personal computers to search the
library's online catalog and the internet, for word processing, and to send e-mail. One
personal computer can offer all of these activities, but many libraries separate them,
limiting some computers to catalog searching only and others to e-mail, internet
research, or word processing. Network computers, which have little local memory,
provide internet-only access in some public areas, and occasionally in staff workspaces.
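The client-server relationship described above can be illustrated with a minimal in-process sketch in Python. The `CatalogServer` class, the sample records, and the search terms are all hypothetical and stand in for a real integrated library system, which would communicate over a network rather than by direct method calls.

```python
# Minimal sketch of the client-server pattern used by library systems.
# CatalogServer stands in for the library's server holding the catalog
# database; search_catalog plays the role of the client software running
# on a staff or patron PC. All names and records here are illustrative.

class CatalogServer:
    """Holds bibliographic records, as the library's server would."""

    def __init__(self, records):
        # Each record is a dict with at least "title" and "author" keys.
        self._records = records

    def query(self, keyword):
        """Answer a client's search request against the catalog."""
        kw = keyword.lower()
        return [r for r in self._records
                if kw in r["title"].lower() or kw in r["author"].lower()]


def search_catalog(server, keyword):
    """Client-side search: send a request to the server, return the hits."""
    return server.query(keyword)


if __name__ == "__main__":
    server = CatalogServer([
        {"title": "Computers in Libraries", "author": "Katie Wilson"},
        {"title": "Cataloguing Basics", "author": "A. Librarian"},
    ])
    for record in search_catalog(server, "librar"):
        print(record["title"])
```

In a real deployment the `query` call would travel over the network (the client-server relationship the text describes), but the division of labor is the same: the server owns the database, and each client merely sends requests and displays results.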
Workstations
Personal computers are sometimes referred to as workstations. The term comes
from the idea of a scholar's workstation: a PC with resources that a scholar or
staff member uses in a work situation. For example, a cataloger's workstation
may have access to the Library of Congress Subject Headings in electronic
format, MARC cataloging information, the library system's cataloging module,
and the internet.
Adaptive Technology
Adaptive or assistive technology describes both hardware and software for use
by people who have difficulties using standard computer setups. Some examples
of adaptive technology in libraries are:
o Keyboards with larger keys, keyboards operated with a wand or pen that
touches the keys, or onscreen keyboards.
o Large mouse trackballs that are easier to hold and move around, e.g. for
children and elderly or disabled users.
o Voice output software that reads aloud the data displayed on a screen.
o Voice recognition software that recognizes and inputs spoken data.
o Closed-circuit TV devices that magnify text.
o Braille software and printers to create, edit, and output braille documents.
References:
Wilson, K. (2006). Computers in libraries: An introduction for library technicians.
Binghamton, NY: The Haworth Information Press.
Vijayakumar, A. and Vijayan, S. S. (2011). Application of information technology in
libraries: An overview. International Journal of Digital Library Services, 1(2).
Retrieved December 14, 2017 from
http://www.ijodls.in/uploads/3/6/0/3/3603729/vijaya12_144-152.pdf
B. Effects of ICT on Libraries
(Fatima May A. Jandoc)
The implementation of information technology in libraries has created demand for new
forms of library service to achieve greater user satisfaction. Digital library services
evolved after the introduction of IT in libraries and information centers. Information
technology has had a significant impact and has successfully changed the character of
the information services generated in libraries. The past two decades have seen
great changes in libraries due to information technology. These technological
advancements have made a significant impact on the growth of knowledge and the
unlocking of human potential. In libraries, the impact is clearly visible on information
resources, services, and people (Manjunatha, 2007).
One of the distinct gifts of information technology has been the invention of devices with
huge storage capacity. CD-ROMs, DVDs, and flash memory cards have changed the
face of libraries. Online access to information has turned many libraries into "virtual
libraries" (Mishra, 2001). Libraries are now changing the way in which information is
stored and disseminated to users.
The next benefit of IT is the automation of library activities. Many in-house operations in
the library, such as acquisition, processing, circulation, maintenance, and serials
management, have changed from manual to automated. The need for automation arises
from the desire to reduce the effort and time required for these jobs, and many software
packages are now available on the market for library automation. IT has also helped in
establishing library networking and resource sharing through the internet and intranets.
Library networks have expanded the scope of resource sharing and information
exchange. Today the internet is a major resource for librarians. The application of IT
has contributed to the provision of quick, quality services in libraries.
Another impact is remote access to a variety of commercial and non-commercial
information sources, i.e. online full-text databases, e-journals, e-books, library
catalogues (OPACs), etc. Present-day information seekers can access worldwide
information through the internet on their desktops without any time limitation.
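Remote access of this kind typically happens over plain HTTP: the patron's browser (or a script) sends a search request to the library's OPAC and receives matching records. The sketch below builds such a request URL with Python's standard library; the host name and the `q`/`field` parameter names are hypothetical, since every OPAC defines its own search interface.

```python
from urllib.parse import urlencode

# Sketch of how a remote OPAC search request is formed. The base URL
# and the "q"/"field" parameter names are hypothetical; a real catalog's
# documentation would specify the actual interface.

def build_opac_query(base_url, term, field="title"):
    """Return a full search URL for the given term and index field."""
    params = urlencode({"q": term, "field": field})  # encodes spaces as '+'
    return f"{base_url}?{params}"


if __name__ == "__main__":
    # A patron searching the (hypothetical) catalog from home:
    url = build_opac_query("https://opac.example.org/search", "digital libraries")
    print(url)
```

Fetching such a URL (for example with `urllib.request.urlopen`) would return the catalog's result page, which is exactly the round trip a patron's browser performs when searching an OPAC remotely.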
IT has had a wide-ranging impact on library and information work. Information activities
have undergone rapid transformations from conventional methods, consequent upon
the introduction of new technologies. This is summarized in the table below.
Activity | Conventional methods | IT-based methods
2. Preserve, Store | Manuscript, Paper-print media | Electronic publishing, Magnetic storage, Videotext, Tele-text, Computer disk, ROM
3. Process | Classification, Cataloguing, Indexing | Electronic data processing, Artificial intelligence/Expert systems
4. Retrieval | Catalogues, Indexes | Database management systems, Information retrieval (off-line, on-line)
REFERENCES
Buenrostro, J. (2015). Library and Information Science Reviewer. Retrieved December
10, 2017 from https://www.scribd.com/document/317123232/Library-and-
Information-Science-Reviewer
Macasaet, K. M. F. V. (2015). 2015 UST College of Education LLE Review: Information
Technology
Wilson, K. (2006). Computers in libraries: An introduction for library technicians.
Binghamton, NY: The Haworth Information Press.
Vijayakumar, A. and Vijayan, S. S. (2011). Application of information technology in
libraries: An overview. International Journal of Digital Library Services, 1(2).
Retrieved December 14, 2017 from
http://www.ijodls.in/uploads/3/6/0/3/3603729/vijaya12_144-152.pdf