
A project report on

Emerging Trends in Information Technology for

Effective MIS and ethical, social issues in
information system

By Group No. 14
Team members:
Ankita Singh 105
Deepshikha Singh 106
Gaurav Kumar Singh 107
Twinkle Singhania 108
Swapnil Sontakke 109
Chhayarani Tantry 110
Yashodhan Taskhedkar 111
Pankaj Thakare 112
Mehul Yadav 123


1. Introduction
2. Decision Support System
3. Executive Support System
4. ERP & EAM
5. Information Security Audit
6. Balanced Scorecard
7. Business Intelligence
8. Data Mining and Big Data
9. Ethical Issues
10. Social Issues
11. Case Study

MIS is short for management information system or management information services.
Management information system, or MIS, broadly refers to a computer-based system that
provides managers with the tools to organize, evaluate and efficiently manage departments
within an organization. In order to provide past, present and prediction information, a
management information system can include software that helps in decision making, data
resources such as databases, the hardware resources of a system, decision support systems,
people management and project management applications, and any computerized processes that
enable the department to run efficiently.
The goals of an MIS are to implement the organizational structure and dynamics of the
enterprise for the purpose of managing the organization in a better way and capturing the
potential of the information system for competitive advantage.
Following are the basic objectives of an MIS:

Capturing Data: Capturing contextual data, or operational information that will contribute to decision making, from various internal and external sources of the organization.

Processing Data: The captured data is processed into information needed for planning,
organizing, coordinating, directing and controlling functionalities at strategic, tactical
and operational level. Processing data means:
o making calculations with the data
o sorting data
o classifying data and
o summarizing data
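These four processing steps can be sketched in a few lines of Python; the sales records below are hypothetical and only for illustration:

```python
# A minimal sketch of the four MIS processing steps:
# calculation, sorting, classification and summarizing.
from collections import defaultdict

# Captured operational data: (department, monthly_sales) records.
records = [("East", 120), ("West", 90), ("East", 150), ("North", 60)]

# Calculation: total sales across all records.
total = sum(amount for _, amount in records)

# Sorting: order records by sales amount, highest first.
ranked = sorted(records, key=lambda r: r[1], reverse=True)

# Classification: group records by department.
by_dept = defaultdict(list)
for dept, amount in records:
    by_dept[dept].append(amount)

# Summarizing: one aggregate figure per department.
summary = {dept: sum(vals) for dept, vals in by_dept.items()}

print(total)    # 420
print(summary)  # {'East': 270, 'West': 90, 'North': 60}
```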

Information Storage: Information or processed data need to be stored for future use.

Information Retrieval: The system should be able to retrieve this information from
the storage as and when required by various users.

Information Propagation: Information or the finished product of the MIS should be

circulated to its users periodically using the organizational network.


A Decision Support System (DSS) is a computer-based information system that
supports business or organizational decision-making activities. DSSs serve the management,
operations, and planning levels of an organization (usually mid and higher management) and
help to make decisions, which may be rapidly changing and not easily specified in advance
(Unstructured and Semi-Structured decision problems). Decision support systems can be either
fully computerized, human-powered, or a combination of both.
Following are the components of the Decision Support System:

Database Management System (DBMS): To solve a problem the necessary data may
come from internal or external database. In an organization, internal data are generated
by a system such as TPS and MIS. External data come from a variety of sources such as
newspapers, online data services, databases (financial, marketing, human resources).

Model Management System: It stores and accesses models that managers use to make
decisions. Such models are used for designing manufacturing facility, analyzing the
financial health of an organization, forecasting demand of a product or service, etc.
Support Tools: Support tools like online help, pull-down menus, user interfaces, graphical analysis and error-correction mechanisms facilitate the user's interaction with the system.

There are several ways to classify DSS. Holsapple and Whinston classify DSS as follows:

Text Oriented DSS: It contains textually represented information that could have a
bearing on decision. It allows documents to be electronically created, revised and
viewed as needed.

Database Oriented DSS: Database plays a major role here; it contains organized and
highly structured data.

Spreadsheet Oriented DSS: It contains information in spreadsheets that allows the user to create, view and modify procedural knowledge and also instructs the system to execute self-contained instructions. The most popular tools are Excel and Lotus 1-2-3.

Solver Oriented DSS: It is based on a solver, which is an algorithm or procedure written to perform certain calculations for a particular problem type.

Rules Oriented DSS: It follows certain procedures adopted as rules; an expert system is an example.

Compound DSS: It is built by using two or more of the five structures explained above.
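As a small illustration of a solver-oriented DSS, the classic economic order quantity (EOQ) formula can act as the "solver"; the demand and cost figures below are invented for the example:

```python
import math

def eoq(annual_demand, order_cost, holding_cost):
    """Economic Order Quantity: the order size that minimizes total
    ordering plus holding cost, assuming steady demand."""
    return math.sqrt(2 * annual_demand * order_cost / holding_cost)

# Hypothetical figures: 1,200 units/year demand, $50 per order,
# $6 per unit per year holding cost.
q = eoq(annual_demand=1200, order_cost=50, holding_cost=6)
print(round(q))  # 141
```

A manager would feed such a solver with data drawn from the DSS database and compare the result against current ordering practice.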


Because DSS are used by knowledge workers, it is possible to apply decision support systems in almost any knowledge domain. In fact, they are so widespread that people often don't realize they are using a DSS. The spreadsheet is a simple DSS that is very commonly used in many different situations!

When you use a search engine, you have used a DSS to organize a huge amount of information, in the form of text files, images and videos, in order to make your decision.

Here we have examples of more complex DSS and how they have been used in
various contexts.

A DSS used in medicine is called a clinical DSS and, in fact, it is said that if used
properly, clinical decision support systems have the potential to change the way
medicine has been taught and practised.

Colorado State has used a DSS to provide information about floods and potential
hazards throughout the State. It includes real-time weather conditions, local and
county data about floods, as well as historical data, floodplain boundaries and
much more.

Real estate investment companies typically use DSS to manage the day to day
running of their businesses. Information about and from each property can be
processed to give access to data across the enterprise that allows for not just day
to day running but also for future planning.

Universities need to fill places every year. Too few students and they
lose money and may lose funding the following year. Too many and they still lose
money because they will have to bear the extra costs themselves. And, of course,

they want the best students possible! And then there's the issue of predicting how
many students will want to enrol in a particular course. Enter DSS used in central
clearing houses...

DSS have been used to forecast the demand for water in particular areas. Using
information about the local geography, historical information about water
consumption in the area as well as prediction models, planners can predict and
plan for future consumption needs in the area.
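A minimal sketch of such a demand-forecasting model is a simple moving average over past consumption; the monthly figures below are hypothetical, and real systems would combine geography and richer prediction models:

```python
def moving_average_forecast(history, window=3):
    """Forecast the next period as the mean of the last `window` observations."""
    recent = history[-window:]
    return sum(recent) / len(recent)

# Hypothetical monthly water demand (thousand cubic metres).
monthly_demand = [410, 395, 420, 450, 465, 480]
print(moving_average_forecast(monthly_demand))  # 465.0
```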

DSS have also been used in integrating weather conditions and air traffic management, for optimizing reservoir operations, auditing health insurance claims, financial planning for small businesses and designing freight transportation systems.

Of course, many businesses have integrated DSS applications into their day to day
operations in order to analyze large amounts of data such as budget sheets, sales
figures and forecasts. They rapidly sift through available data and are used
extensively to allow faster decision-making, identification of market trends and
improved allocation of resources.


Following are the benefits of a DSS:

Improves performance and effectiveness of the user

Allows for faster decision-making

Reduces the time taken to solve problems

These combine to save money!

Has been seen to improve collaboration and communication within groups

Reduces training times because the experience of experts is available within the program's algorithms

Provides more evidence in support of a decision

May increase decision-maker satisfaction

Provides different perspectives on a situation

Helps automate various business systems


Following are the disadvantages of a DSS:

Too much emphasis/control given to the machines.

May reduce skill in staff because they become dependent on the computers

Reduction in efficiency because of information overload

Shift of responsibility - easy to blame computer!

Disgruntled employees who feel they are now only doing clerical work

False sense of being objective - humans still feed information in and decide how
exactly to process it.


Enterprise resource planning (ERP) is business management software, usually a suite of integrated applications, that a company can use to collect, store, manage and interpret data
from many business activities, including:

Product planning, cost

Manufacturing or service delivery
Marketing and sales
Inventory management
Shipping and payment


Separate systems were maintained during the 1960s and 1970s for traditional business functions like Sales & Marketing, Finance, Human Resources, Manufacturing, and Supply Chain Management. These systems were often incongruent, hosted in different databases and required batch updates. It was difficult to manage business processes across business functions, e.g. the procure-to-pay and sales-to-cash processes. ERP systems grew to replace these islands of information by integrating the traditional business functions.
Online Transaction Processing (OLTP)

It is a class of information systems that facilitate and manage transaction-oriented

applications, typically for data entry and retrieval transaction processing.

It is used to refer to processing in which the system responds immediately to user requests.


It supports mission-critical tasks through simple queries of operational databases

It includes Sales and Distribution, Business Planning, Production Planning, Shop

Floor Control, and Logistics modules

Online Analytical Processing (OLAP)

It is an approach to answering multi-dimensional analytical queries swiftly

OLAP is part of the broader category of business intelligence, which also

encompasses relational database, report writing and data mining

OLAP consists of three basic analytical operations: consolidation (roll-up), drill-down, and slicing and dicing

It is a decision support tool for management-critical tasks through analytical

investigation of complex data associations

It supplies management with real-time information and permits timely decisions

to improve performance and achieve competitive advantage
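The OLTP/OLAP contrast can be demonstrated with SQLite: row-at-a-time inserts are the OLTP side, while a GROUP BY roll-up is a basic OLAP consolidation. The table and figures below are invented for illustration:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE sales (region TEXT, product TEXT, amount REAL)")

# OLTP: many small, immediate transactions -- one INSERT per sale.
cur.executemany(
    "INSERT INTO sales VALUES (?, ?, ?)",
    [("East", "A", 100.0), ("East", "B", 250.0), ("West", "A", 175.0)],
)
conn.commit()

# OLAP: an analytical consolidation (roll-up) over the accumulated data.
rollup = cur.execute(
    "SELECT region, SUM(amount) FROM sales GROUP BY region ORDER BY region"
).fetchall()
print(rollup)  # [('East', 350.0), ('West', 175.0)]
```

Production OLAP systems use dedicated multidimensional stores rather than a single table, but the roll-up operation is the same idea.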



Following are the characteristics of an ERP system:

An integrated system that operates in (or near) real time without relying on periodic updates

A common database that supports all applications

A consistent look and feel across modules

Installation of the system with elaborate application/data integration by the

Information Technology (IT) department, provided the implementation is not done
in small steps


Database Configuration
selection of database tables in the thousands
setting the switches in the system

Bolt-on Software
third-party vendors provide specialized functionality software
Supply-Chain Management (SCM) links vendors, carriers, third-party logistics
companies, and information systems providers

Following are the advantages of ERP:

Sales forecasting allows inventory optimization
Chronological history of transactions through data compilation
Order tracking from acceptance through fulfillment
Revenue tracking from invoice through cash receipt
Matching purchase orders, inventory receipts, and costing
Brings legitimacy and transparency to each bit of statistical data

Facilitates standard product naming

Protects sensitive data by system consolidation

Following are the disadvantages of ERP:

Customization is problematic
ERP is expensive to implement
Reengineering Business processes to suit the ERP system is complicated
High ERP switching costs can increase support and maintenance expenses
Integration of independent businesses can lead to unnecessary dependencies
Due to ERP's architecture (OLTP, On-Line Transaction Processing) ERP systems
are not well suited for production planning and supply chain management (SCM).
Harmonization of ERP systems and training needs a lot of time and money

Enterprise asset management (EAM)

Enterprise asset management (EAM) is the optimal lifecycle management of the physical
assets of an organization. It covers subjects including the design, construction,
commissioning, operations, maintenance and decommissioning/replacement of plant,
equipment and facilities.
"Enterprise" refers to the scope of the assets across departments, locations, facilities and,
potentially, business units.
Enterprise asset management enables companies to drive maintenance best practices and
manage the full asset lifecycle with a complete view of all types of assets and equipment.


Key to effective asset management is knowing where the asset is in its lifecycle and whether it is providing value to your business. Further, EAM provides techniques that are helpful in commissioning and replacing assets, and many systems also help keep track of warranties on assets.


Support assets and Measure asset performance

Provide visibility and control over critical assets that affect compliance, risk and
business performance

Move from reactive maintenance mode to preventive and condition-based maintenance


Increase the useful life of physical assets with improved business processes for an
increased return on assets and enhanced operational efficiency.

Define, organize, and track failure metrics and history

Tracking check-ins and checkouts of assets

Scheduling maintenance and managing contracts pertaining to assets.


Manage all types of assets within a single repository, whether you own assets across multiple industries or a diverse asset portfolio.
Support assets, including fixed plant, mobile, infrastructure, and linear with a
strong foundation and flexible framework
Measure asset performance in a single view and repository with self-service
Move from reactive maintenance mode to preventive and condition-based maintenance
Define, organize, and track failure metrics and history with unique capabilities


Improve return on capital assets by integrating physical and financial aspects and
supporting deep collaboration between project lifecycle and service lifecycle
Estimate costs based on material, labor, and equipment requirements
Generate and track actual costs and capture and retain work history to make
informed decisions about future maintenance work
Conduct asset deployment transactions such as move, reinstatement, retire, and
other transactions

Perhaps one of the biggest challenges for companies, especially asset-intensive organizations, is how to effectively manage all their different types of assets without creating a huge management workload that adversely impacts the bottom line. Underperforming assets can have many negative effects on a company's operations. Asset downtime can disrupt production runs and lead to late deliveries. Inadequate preventive maintenance can result in major unbudgeted expenditures to repair or replace failing equipment. The key to effective asset management is knowing where a particular asset is in its lifecycle at any given moment and whether it is providing value to your business. If you know the current state of an asset, then you can intelligently plan and budget for updates, replacements and other changes in the asset's lifecycle. This strategic planning can have a huge impact on a company's bottom line.

Improve equipment reliability and plant utilization with solutions that offer real-time
collaboration and adaptive procedures. Transform your organization from fragmented and
reactive to customer-centric and predictive.
EAM enables organizations to:
Support all types of diverse assets across multiple industries
Improve productivity with real-time analytics and preventive maintenance schedules
Align the value chain with demand-driven spares, MRO materials and maintenance
management that is integrated with financials and operational scheduling
Enforce compliance to better control risks, adhere to closed-loop inspection procedures,
and ensure worker safety
Streamlining maintenance work is the first step in achieving effective maintenance
strategies and moving from reactive maintenance-mode to preventive and condition-based
maintenance. EAM enables users to create Work Orders with its Activities feature. These
Activities are used to create a library of job plans and standard operating procedures used
to drive Preventive Maintenance (PM) schedules. When implemented, a PM program
helps organizations run more effectively and efficiently, with fewer unexpected
breakdowns. Meters or Counters attached to assets can be used to report and track
operating conditions. EAM offers unique capabilities to define multiple meters for assets
and allows multiple assets to share these meters. Failure Analysis is an important
component of maintenance. With EAM, users can define, organize and track failure
metrics and history.
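A meter-driven PM trigger of the kind described above might be sketched as follows; the run-hours meter and the 500-hour interval are illustrative assumptions, not values from any particular EAM product:

```python
def pm_due(meter_reading, last_service_reading, service_interval):
    """Return True when usage since the last service reaches the PM interval."""
    return meter_reading - last_service_reading >= service_interval

# Hypothetical pump serviced at 1,200 run-hours, with PM every 500 run-hours.
print(pm_due(meter_reading=1750, last_service_reading=1200, service_interval=500))  # True
print(pm_due(meter_reading=1450, last_service_reading=1200, service_interval=500))  # False
```

In a real EAM system this check would run against shared meters and generate a work order from the job-plan library when it fires.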
Work management is at the core of all maintenance operations and includes the
identification of maintenance issues through work requests and the ability to execute the
maintenance work orders. EAM enables users to route work orders through any required
approvals. Maximizing asset availability, increasing plant productivity and decreasing
maintenance costs are the enterprise objectives that work planning and forecasting can
help achieve.
Reduce equipment and maintenance cost
EAM helps reduce equipment and maintenance cost through the effective collection of asset maintenance costs and work history. EAM enables users to estimate costs based on material, labour and equipment requirements; these estimates can provide the basis for work order approvals. Once the work is done, actual costs are generated and tracked to enable managers to make informed decisions about maintenance trends in assets' operational costs, and work history is also captured and retained.


Planning and scheduling

Work order creation
Maintenance history
Inventory and procurement
Equipment, component and asset tracking for assemblies of equipment.
In some instances, the functionality is extended by the addition of basic financial
management modules, such as accounts payable, cost recording in ledgers, and HR
functions such as a maintenance skills database.

Balanced Scorecard Approach

The Balanced Scorecard is a strategic planning and management system used to align
business activities to the vision and strategy of the organization by monitoring
performance against strategic goals.
To grow your business, increase your market shares, improve your business
processes, and have fanatically loyal customers
To have a clear picture of what your business is really doing and why
To establish a system for measuring your business performance and to allow you to
make the right decisions, on time, every time
To enable your people to do the right things right, the first time

The Balanced Scorecard model suggests that we view the organization from four perspectives, then develop metrics, collect data and analyze it relative to each of these perspectives:

Financial Perspective

What must we do to create sustainable economic value?

Internal Business Process Perspective

To satisfy our stakeholders, what must be our levels of productivity,

efficiency, and quality?

Learning and Growth Perspective

How does our employee performance management system, including

feedback to employees, support high performance?

Customer Perspective

What do our customers require from us and how are we doing according to
those requirements?
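The four perspectives and their metrics can be represented as a simple data structure. The KPIs and targets below are invented for illustration, and the attainment measure assumes higher is better (a real scorecard would also handle lower-is-better metrics):

```python
# Hypothetical KPIs: (name, target, actual), one list per perspective.
scorecard = {
    "Financial": [("Revenue growth %", 8.0, 6.5)],
    "Customer": [("Satisfaction score", 90.0, 92.0)],
    "Internal Business Process": [("On-time delivery %", 95.0, 91.0)],
    "Learning and Growth": [("Training hours per employee", 40.0, 35.0)],
}

def attainment(target, actual):
    """Actual result as a fraction of target (assumes higher is better)."""
    return actual / target

for perspective, kpis in scorecard.items():
    for name, target, actual in kpis:
        print(f"{perspective}: {name} at {attainment(target, actual):.0%} of target")
```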

Key Implementation Success Factors

Don't view the balanced scorecard strategy as a monthly approach to quick and
easy answers to your problems.

Be persistent and thorough in establishing your strategies, tactics, and key

performance indicators.

Communicate your scorecards and dashboard throughout your business and

advertise how you're doing, what you're doing, and why.

Use teamwork and involve your people in your balanced scorecards.

Obtaining executive sponsorship and commitment

Involving a broad base of leaders, managers and employees in scorecard development


Beginning interactive (two-way) communication first

Getting outside help if needed

Scorecard Potential Drawbacks and Criticisms

Lack of a well-defined strategy
o The balanced scorecard relies on a well-defined strategy and an understanding of the linkages between strategic objectives and metrics. Without this foundation the implementation could fail.
Too much focus on the lagging measures
o Focusing on only the lagging measures may cause a lack of priority or
opportunity for the leading measures
Use of Generic Metrics
o Don't just copy metrics from another firm. Identify the measures that apply to your strategy and competitive position
Self-serving managers
o Managers whose goal is to achieve a desired result in order to obtain a
bonus or other self-reward
Costly and time-consuming tool


An executive information system (EIS), also known as an executive support system (ESS),
is a type of management information system that facilitates and supports senior executive
information and decision-making needs. It provides easy access to internal and external
information relevant to organizational goals. It is commonly considered a specialized form of decision support system (DSS).
An EIS emphasizes graphical displays and easy-to-use user interfaces, and offers strong reporting
and drill-down capabilities. In general, EIS are enterprise-wide DSS that help top-level
executives analyze, compare, and highlight trends in important variables so that they can
monitor performance and identify opportunities and problems. EIS and data
warehousing technologies are converging in the marketplace.
In recent years, the term EIS has lost popularity in favor of business intelligence (with the sub
areas of reporting, analytics, and digital dashboards).
EIS helps executives find data according to user-defined criteria and promotes information-based insight and understanding. Unlike a traditional management information system presentation, an EIS can distinguish between vital and seldom-used data and track different key critical activities for executives, both of which are helpful in evaluating whether the company is meeting its corporate objectives. After realizing its advantages, people have applied EIS in many areas, especially manufacturing, marketing, and finance.
Manufacturing is the transformation of raw materials into finished goods for sale, or
intermediate processes involving the production or finishing of semi-manufactures. It is a large
branch of industry and of secondary production. Manufacturing operational control focuses on
day-to-day operations, and the central idea of this process is effectiveness and efficiency.
In an organization, marketing executives' duty is managing available marketing resources to create a more effective future. For this, they need to make judgments about the risk and uncertainty of a project and its impact on the company in the short term and long term. To assist marketing
executives in making effective marketing decisions, an EIS can be applied. EIS provides sales

forecasting, which allows the marketing executive to compare the sales forecast with past sales. EIS also offers an approach to product pricing, which is found in venture analysis. The marketing executive can evaluate pricing as related to competition along with the relationship of product quality with price charged. In summary, an EIS software package enables marketing executives to
manipulate the data by looking for trends, performing audits of the sales data, and calculating
totals, averages, changes, variances, or ratios.
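Those calculations (totals, averages, changes, variances, ratios) are straightforward to sketch; the forecast and actual series below are hypothetical:

```python
def sales_metrics(forecast, actual):
    """Totals, averages, change, variance and ratio for two equal-length sales series."""
    return {
        "total_actual": sum(actual),
        "average_actual": sum(actual) / len(actual),
        "change": actual[-1] - actual[0],
        "variance_vs_forecast": sum(actual) - sum(forecast),
        "actual_to_forecast_ratio": sum(actual) / sum(forecast),
    }

forecast = [100, 110, 120, 130]   # quarterly sales forecast
actual = [95, 115, 125, 145]      # actual sales for the same quarters
metrics = sales_metrics(forecast, actual)
print(metrics["total_actual"], metrics["variance_vs_forecast"])  # 480 20
```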
Financial analysis is one of the most important steps for companies today. Executives need to
use financial ratios and cash flow analysis to estimate the trends and make capital investment
decisions. An EIS integrates planning or budgeting with control of performance reporting, and
it can be extremely helpful to finance executives. EIS focuses on financial performance
accountability, and recognizes the importance of cost standards and flexible budgeting in
developing the quality of information provided for all executive levels.
Advantages and Disadvantages
Advantages of EIS

Easy for upper-level executives to use; extensive computer experience is not required

Provides timely delivery of company summary information

Information that is provided is better understood

Enables management to make decisions promptly
Improves tracking information
Offers efficiency to decision makers

Disadvantages of EIS

System dependent

Limited functionality, by design

Information overload for some managers

Benefits hard to quantify

High implementation costs

System may become slow, large, and hard to manage

Need good internal processes for data management

May lead to less reliable and less secure data

Future trends
The future of executive information systems is not bound to mainframe computer systems. This trend frees executives from learning different computer operating systems and substantially decreases implementation costs. Because this trend includes using existing software applications, executives don't need to learn a new or special language for the EIS package.


An information security audit occurs when a technology team conducts an organizational
review to ensure that the correct and most up-to-date processes and infrastructure are being
applied. An audit also includes a series of tests that guarantee that information security meets
all expectations and requirements within an organization. During this process, employees are
interviewed regarding security roles and other relevant details.
Every organization should perform routine security audits to ensure that data and assets are protected. First, the audit's scope should be decided and should include all company assets related to information security, including computer equipment, phones, networks, email, data and any access-related items such as cards, tokens and passwords. Then, past and potential future threats to those assets must be reviewed. Anyone in the information security field should stay apprised of new trends, as well as security measures taken by other companies. Next, the auditing team should estimate the amount of damage that could transpire under threatening conditions. There should be an established plan and controls for maintaining business operations after a threat has occurred (business continuity), complemented by technical controls such as an intrusion prevention system.

Need for Information Security Audit

Hackers, viruses and worms are wreaking havoc and causing significant monetary, competitive
and psychological damage. For corporations, mitigating the potential loss involves timely
detection, effective communication and a plan for resolution.
Information security organizations and audit organizations have the same goal: to see that
mission-critical information is properly protected from unauthorized access and/or update. It is
wise for security practitioners to bring audit guidance into a security project -- to include
deploying firewalls and intrusion detection systems -- during the early planning stages. This
will help to ensure that the resulting controls will be appropriately implemented, both
technically and operationally, for protection as well as compliance with security policies that
govern the overall security program.

Internal Audit
The purpose of an internal audit is to provide operations management with an independent review of the adequacy and effectiveness of the operation's internal controls. Your internal auditing department will expect you to comply with a standard set of controls and guidelines. They will create a scope of what they are planning to address in their audit and prepare a risk assessment.
What is a risk assessment? As you prepare to begin your audit, you need to perform a risk
assessment to determine the threats and vulnerabilities creating a risk to the business
environment. The degree of the risk is compared with the adequacy and effectiveness of

controls in place to mitigate the risk. It requires much detail and past experience to conduct an effective audit. Some companies create their own matrices and scales. Others rely on software-driven programs to help determine the risks. The assessment-tool market is small, and the companies providing the tools are very small companies.
The IT auditors may contact your department at this time and let you know their intentions
toward conducting the audit. They may sit down and review the scope and determine who will
be their auditees.
The IT auditors will create several individual test cases towards testing the security of the
server and its environment.
At the conclusion of the audit, an oral report is usually given to management, accompanied by a written report. At this time you will need to decide what actions to take in response to the report, or whether you wish to assume the risks involved.

IT Audit Security Techniques

IT security auditing to assess the security posture of systems and networks can include a
combination of the following:
Network Scanning
Network scanning refers to the use of a computer network to gather information regarding
computing systems. Network scanning is mainly used for security assessment, system
maintenance, and also for performing attacks by hackers. The purpose of network scanning is
as follows:

Recognize available UDP and TCP network services running on the targeted hosts
Recognize filtering systems between the user and the targeted hosts
Determine the operating systems (OSs) in use by assessing IP responses
Evaluate the target host's TCP sequence number predictability to assess its susceptibility to sequence prediction attacks and TCP spoofing
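A minimal TCP connect scan of the kind described can be written with Python's standard socket module. Scan only hosts you are authorized to test; the ports listed are common examples:

```python
import socket

def scan_tcp_ports(host, ports, timeout=0.5):
    """Return the subset of `ports` accepting TCP connections on `host`."""
    open_ports = []
    for port in ports:
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
            s.settimeout(timeout)
            if s.connect_ex((host, port)) == 0:  # 0 means the connect succeeded
                open_ports.append(port)
    return open_ports

# Probe a few well-known ports on the local machine only.
print(scan_tcp_ports("127.0.0.1", [22, 80, 443, 8080]))
```

Real scanners such as Nmap add UDP probes, OS fingerprinting and service detection on top of this basic loop.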

Vulnerability scanning
It is the automated process of proactively identifying security vulnerabilities of computing
systems in a network in order to determine if and where a system can be exploited and/or
threatened. While public servers are important for communication and data transfer over the
Internet, they open the door to potential security breaches by threat agents, such as malicious hackers.
Vulnerability scanning employs software that seeks out security flaws based on a database of
known flaws, testing systems for the occurrence of these flaws and generating a report of the
findings that an individual or an enterprise can use to tighten the network's security.
Vulnerability scanning typically refers to the scanning of systems that are connected to the
Internet but can also refer to system audits on internal networks that are not connected to the
Internet in order to assess the threat of rogue software or malicious employees in an enterprise.

Password Cracking
Password cracking refers to various measures used to discover computer passwords. This is
usually accomplished by recovering passwords from data stored in, or transported from, a
computer system. Password cracking is usually done by repeatedly guessing the password, often through a computer algorithm in which the computer tries numerous combinations until the password is successfully discovered.
Password cracking can be done for several reasons, but the most malicious reason is to gain unauthorized access to a computer without the computer owner's awareness. This results
in cybercrime such as stealing passwords for the purpose of accessing banking information.
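A toy dictionary attack against an unsalted SHA-256 hash illustrates the guessing loop; the password and wordlist are invented, and real crackers use far larger lists with optimized hashing:

```python
import hashlib

def dictionary_attack(target_hash, wordlist):
    """Try each candidate; return the one whose SHA-256 matches, else None."""
    for candidate in wordlist:
        if hashlib.sha256(candidate.encode()).hexdigest() == target_hash:
            return candidate
    return None

# A weak, guessable password falls to the wordlist almost immediately.
stored = hashlib.sha256(b"sunshine").hexdigest()
print(dictionary_attack(stored, ["letmein", "123456", "sunshine", "qwerty"]))  # sunshine
```

This is why defenders mandate salted, slow password hashes (e.g. bcrypt) and strong, non-dictionary passwords.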

Virus Detectors
The virus detector installed on the network infrastructure is usually installed on mail servers or
in conjunction with firewalls at the network border of an organization. Server based virus
detection programs can detect viruses before they enter the network or before users download
their e-mail. The other type of virus detection software is installed on end-user machines.
Software detects malicious code in e-mails, USB disks, hard disks, documents and the like but
only for the local host. The software also sometimes detects malicious code from web sites.
This type of virus detection program has less impact on network performance but generally
relies on end-users to update their signatures, a practice that is not always reliable.

Integrity Checkers
Integrity checking tools can detect whether any critical system files have been changed, thus
enabling the system administrator to look for unauthorized alteration of the system.
Integrity checkers examine stored files or network packets to determine if they have been
altered or changed. They can only flag a change as suspicious; they cannot determine if the
change is a genuine virus infection.
These checkers are based on checksums, a simple mathematical operation that turns an entire file or message into a number. More complex hash functions that produce a fixed-length digest are also used. The integrity checking process begins with the creation of a
baseline, where checksums or hashes for clean data are computed and saved. Each time the
integrity checker is run, it again makes a checksum or hash computation and compares the
result with the stored value.
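The baseline-and-compare process can be sketched in a few lines of Python using SHA-256 hashes. File contents are passed in as bytes for simplicity; a real tool would read files from disk and protect the stored baseline itself against tampering.

```python
import hashlib

def build_baseline(files):
    """Compute and save a SHA-256 hash for each file's clean contents."""
    return {path: hashlib.sha256(data).hexdigest()
            for path, data in files.items()}

def check_integrity(files, baseline):
    """Flag files whose current hash differs from the stored baseline."""
    return [path for path, data in files.items()
            if hashlib.sha256(data).hexdigest() != baseline.get(path)]
```

As the text notes, a flagged file is only suspicious: the checker cannot tell a legitimate update from a virus infection.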

Penetration Testing
A penetration test, or pen test, is an attempt to evaluate the security of an IT infrastructure by
safely trying to exploit vulnerabilities. These vulnerabilities may exist in operating systems,

service and application flaws, improper configurations, or risky end-user behavior. Such assessments are also useful in validating the efficacy of defensive mechanisms, as well as end-user adherence to security policies.
Penetration tests are typically performed using manual or automated technologies to
systematically compromise servers, endpoints, web applications, wireless networks, network
devices, mobile devices and other potential points of exposure. Once vulnerabilities have been
successfully exploited on a particular system, testers may attempt to use the compromised
system to launch subsequent exploits at other internal resources, specifically by trying to
incrementally achieve higher levels of security clearance and deeper access to electronic assets
and information via privilege escalation.

Audit planning & preparation

The auditor should be adequately educated about the company and its critical business activities
before conducting a data center review. The objective of the data center is to align data center
activities with the goals of the business while maintaining the security and integrity of critical
information and processes. To adequately determine whether or not the client's goal is being achieved, the auditor should perform the following before conducting the review:

- Meet with IT management to determine possible areas of concern
- Review the current IT organization chart
- Review job descriptions of data center employees
- Research all operating systems, software applications and data center equipment operating within the data center
- Review the company's IT policies and procedures
- Evaluate the company's IT budget and systems planning documentation
- Review the data center's disaster recovery plan

Establishing audit objectives

The next step in conducting a review of a corporate data center takes place when the auditor
outlines the data center audit objectives. Auditors consider multiple factors that relate to data
center procedures and activities that potentially identify audit risks in the operating
environment and assess the controls in place that mitigate those risks. After thorough testing
and analysis, the auditor is able to adequately determine if the data center maintains proper
controls and is operating efficiently and effectively.
Following is a list of objectives the auditor should review:

- Personnel procedures and responsibilities, including systems and cross-functional training
- Change management processes are in place and followed by IT and management personnel
- Appropriate backup procedures are in place to minimize downtime and prevent loss of important data
- The data center has adequate physical security controls to prevent unauthorized access to the data center
- Adequate environmental controls are in place to ensure equipment is protected from fire and flooding

Performing the review

The next step is collecting evidence to satisfy data center audit objectives. This involves
traveling to the data center location and observing processes and procedures performed within
the data center. The following review procedures should be conducted to satisfy the predetermined audit objectives:

- Data center personnel: All data center personnel should be authorized to access the data center (key cards, login IDs, secure passwords, etc.). Data center employees should be adequately educated about data center equipment and properly perform their jobs. Vendor service personnel should be supervised when doing work on data center equipment. The auditor should observe and interview data center employees to satisfy their objectives.
- Equipment: The auditor should verify that all data center equipment is working properly and effectively. Equipment utilization reports, equipment inspection for damage and functionality, system downtime records and equipment performance measurements all help the auditor determine the state of data center equipment. Additionally, the auditor should interview employees to determine if preventative maintenance policies are in place and followed.
- Policies and procedures: All data center policies and procedures should be documented and located at the data center. Important documented procedures include data center personnel job responsibilities, backup policies, security policies, employee termination policies, system operating procedures and an overview of operating systems.
- Physical security / environmental controls: The auditor should assess the security of the client's data center. Physical security includes security guards, locked cages, man traps, single entrances, bolted-down equipment, and computer monitoring systems. Additionally, environmental controls should be in place to ensure the security of data center equipment. These include air conditioning units, raised floors, humidifiers and an uninterruptible power supply.
- Backup procedures: The auditor should verify that the client has backup procedures in place in case of system failure. Clients may maintain a backup data center at a separate location that allows them to instantaneously continue operations in the event of system failure.

Issuing the review report

The data center review report should summarize the auditor's findings and be similar in format
to a standard review report. The review report should be dated as of the completion of the

auditor's inquiry and procedures. It should state what the review entailed and explain that a
review provides only "limited assurance" to third parties.

Benefits of Network Security Audits

Network security audits help identify vulnerabilities on your network and network devices

- Running services: Any service running on a network device can be used to attack a system. A solid network security audit helps you identify all services and turn off any that are unnecessary.
- Open ports: A network security audit helps you identify all open ports on network devices; just like running services, all unneeded ports should be closed to eliminate the possibility of their being used to attack a network device.
- Passwords: Assessments/audits should evaluate the enterprise password policy and ensure that the passwords used on network devices meet the business password policy for strength, frequency of change, and other requirements.
- User accounts: During the audit, determine which user accounts are no longer being used so they can be removed or disabled. Unused accounts allow someone from inside or outside the network to attack and take over the account, or may be an indication of a successful attack on the network.
- Unapproved devices: Unapproved or unknown devices such as iPods, smartphones and wireless access points installed on the network must be detected in an audit. Any of these, as well as other devices, can be used to attack the network or steal data from it.
- Applications: The types of applications in use on a system should be identified during this process. If any dangerous applications are found running, they should be removed. Also look for software programs that run automatically, because they can be an indicator of a malware infection.
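Two of these checks, open ports and stale user accounts, can be sketched as simple comparisons against a baseline. The approved port set and the account data below are hypothetical examples.

```python
# Sketch of two automated audit checks: flag open ports outside an
# approved baseline, and flag user accounts with no recent login.
# APPROVED_PORTS and the sample data are hypothetical.

APPROVED_PORTS = {22, 80, 443}

def audit_ports(open_ports_by_device):
    """Return device -> set of open ports that should be closed."""
    return {dev: ports - APPROVED_PORTS
            for dev, ports in open_ports_by_device.items()
            if ports - APPROVED_PORTS}

def audit_accounts(last_login_days, max_idle_days=90):
    """Return accounts unused for longer than the idle threshold."""
    return sorted(acct for acct, days in last_login_days.items()
                  if days > max_idle_days)
```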


Business intelligence (BI) has two different basic meanings related to the use of the term intelligence. The first, less frequently used, is the human intelligence capacity applied to business affairs and activities. In this sense, intelligence of business is a field investigating the application of human cognitive faculties and artificial intelligence technologies to management and decision support for different business problems.
The second relates to intelligence as information valued for its currency and relevance. It is
expert information, knowledge and technologies efficient in the management of organizational
and individual business. Therefore, in this sense, business intelligence is a broad category of
applications and technologies for gathering, providing access to, and analyzing data for the
purpose of helping enterprise users make better business decisions. The term implies having a
comprehensive knowledge of all of the factors that affect the business. It is imperative that
firms have an in depth knowledge about factors such as the customers, competitors, business
partners, economic environment, and internal operations to make effective and good quality
business decisions. Business intelligence enables firms to make these kinds of decisions.
A specialized field of business intelligence known as competitive intelligence focuses solely on the external competitive environment. Information is gathered on the actions of competitors and decisions are made based on this information. Little if any attention is paid to gathering internal information.
In modern businesses, increasing standards, automation, and technologies have led to vast
amounts of data becoming available. Data warehouse technologies have set up repositories to
store this data. Improved Extract, transform, load (ETL) and even recently Enterprise
Application Integration tools have increased the speedy collecting of data. OLAP reporting
technologies have allowed faster generation of new reports which analyze the data. Business
intelligence has now become the art of sifting through large amounts of data, extracting pertinent information, and turning that information into knowledge upon which actions can be taken.
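The extract-transform-load flow mentioned above can be sketched with Python's standard library: rows are extracted from CSV text, cleaned and type-converted, and loaded into an in-memory SQLite table ready for reporting queries. The table, column names and data are invented for the example.

```python
import csv
import io
import sqlite3

RAW = """region,amount
north, 120
south,80
north,40
"""

def etl(raw_csv):
    """Extract rows from CSV, transform (trim/convert), load into SQLite."""
    rows = [(r["region"].strip(), int(r["amount"]))          # extract + transform
            for r in csv.DictReader(io.StringIO(raw_csv))]
    db = sqlite3.connect(":memory:")
    db.execute("CREATE TABLE sales (region TEXT, amount INTEGER)")
    db.executemany("INSERT INTO sales VALUES (?, ?)", rows)  # load
    return db

if __name__ == "__main__":
    db = etl(RAW)
    for row in db.execute("SELECT region, SUM(amount) FROM sales "
                          "GROUP BY region ORDER BY region"):
        print(row)
```

Production ETL tools add scheduling, incremental loads and error handling, but the three conceptual steps are the same.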
Stackowiak et al. (2007) define Business intelligence as the process of taking large amounts of
data, analyzing that data, and presenting a high-level set of reports that condense the essence of
that data into the basis of business actions, enabling management to make fundamental daily
business decisions. Cui et al. (2007) view BI as a way and method of improving business performance by providing powerful assistance to executive decision makers, enabling them to have actionable information at hand. BI tools are seen as technology that enables the efficiency of business operations by adding value to enterprise information and hence to the way this information is utilized.

Zeng et al. (2006) define BI as "the process of collection, treatment and diffusion of information that has as an objective the reduction of uncertainty in the making of all strategic decisions." Experts describe business intelligence as a business management term used to describe applications and technologies used to gather, provide access to, and analyze data and information about an enterprise, in order to help users make better-informed business decisions.
Tvrdíková (2007) describes the basic characteristic of a BI tool as its ability to collect data from heterogeneous sources, to apply advanced analytical methods, and to support multiple users' demands.
Zeng et al. (2006) categorized BI technology based on the method of information delivery: reporting, statistical analysis, ad-hoc analysis and predictive analysis.
The concept of Business Intelligence (BI) was brought up by the Gartner Group in 1996. It is defined as the application of a set of methodologies and technologies, such as J2EE, .NET, Web Services, XML, data warehouses, OLAP, data mining and representation technologies, to improve enterprise operational effectiveness and support management decisions in order to achieve competitive advantage. Business intelligence today is not a new technology but an integrated solution for companies, within which the business requirement is the key factor that drives technology innovation. How to identify and creatively address key business issues is therefore always the major challenge for a BI application in achieving real business value.
Golfarelli (2004) defined BI as including an effective data warehouse and also a reactive component capable of monitoring time-critical operational processes, to allow tactical and operational decision-makers to tune their actions according to the company strategy.
Gangadharan and Swamy (2004) define BI as the result of in-depth analysis of detailed business data, including database and application technologies as well as analysis practices. They widen the definition of BI to technically much broader tools, potentially encompassing knowledge management, enterprise resource planning, decision support systems and data mining.
BI includes several kinds of software for extraction, transformation and loading (ETL), data warehousing, database query and reporting, multidimensional/on-line analytical processing (OLAP) data analysis, data mining and visualization (Berson, 2002; Curt Hall, 1999).
OLAP (On-line analytical processing): It refers to the way in which business users can slice
and dice their way through data using sophisticated tools that allow for the navigation of dimensions such as time or hierarchies. OLAP provides multidimensional, summarized views of business data and is used for reporting, analysis,
modeling and planning for optimizing the business. OLAP techniques and tools can be used to
work with data warehouses or data marts designed for sophisticated enterprise intelligence
systems. These systems process queries required to discover trends and analyze critical factors.
Reporting software generates aggregated views of data to keep the management informed about
the state of their business. Other BI tools are used to store and analyze data, such as data mining
and data warehouses; decision support systems and forecasting; document warehouses and
document management; knowledge management; mapping, information visualization, and dash
boarding; management information systems, geographic information systems; Trend Analysis;
Software as a Service (SaaS).
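The slice-and-dice idea can be illustrated with a tiny roll-up over an in-memory fact table: revenue is aggregated along whichever dimensions the user chooses. The fact rows below are invented sample data.

```python
from collections import defaultdict

# Toy fact table: (year, region, product, revenue)
FACTS = [
    (2023, "EU", "laptop", 900),
    (2023, "US", "laptop", 1100),
    (2024, "EU", "phone",  600),
    (2024, "EU", "laptop", 950),
]

def rollup(facts, *dims):
    """Aggregate revenue along the chosen dimensions
    (0 = year, 1 = region, 2 = product)."""
    totals = defaultdict(int)
    for row in facts:
        key = tuple(row[d] for d in dims)
        totals[key] += row[3]
    return dict(totals)

# Slice by year with rollup(FACTS, 0), or dice by year and region
# with rollup(FACTS, 0, 1).
```

A real OLAP engine precomputes such aggregates across whole dimension hierarchies so that queries return in seconds even over billions of fact rows.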
Advanced Analytics: Often referred to as data mining, forecasting or predictive analytics, this takes advantage of statistical analysis techniques to predict or provide certainty measures on facts.
Corporate Performance Management (Portals, Scorecards, Dashboards): this general
category usually provides a container for several pieces to plug into so that the aggregate tells a
story. For example, a balanced scorecard that displays portlets for financial metrics combined
with say organizational learning and growth metrics.
Real time BI: It allows for the real time distribution of metrics through email, messaging
systems and/or interactive displays.
Data Warehouse and data marts: The data warehouse is the significant component of
business intelligence. It is subject-oriented and integrated. The data warehouse supports the
physical propagation of data by handling the numerous enterprise records for integration,
cleansing, aggregation and query tasks. It can also contain the operational data which can be
defined as an updateable set of integrated data used for enterprise wide tactical decision-making
of a particular subject area. It contains live data, not snapshots, and retains minimal history.
A data mart as described by (Inmon, 1999) is a collection of subject areas organized for
decision support based on the needs of a given department. Finance has their data mart,
marketing has theirs, and sales have theirs and so on. And the data mart for marketing only
faintly resembles anyone else's data mart. Perhaps most importantly, (Inmon, 1999) the
individual departments own the hardware, software, data and programs that constitute the data
mart. Each department has its own interpretation of what a data mart should look like and each

department's data mart is peculiar to and specific to its own needs. Similar to data warehouses,
data marts contain operational data that helps business experts to strategize based on analyses
of past trends and experiences. The key difference is that the creation of a data mart is
predicated on a specific, predefined need for a certain grouping and configuration of select data.
There can be multiple data marts inside an enterprise. A data mart can support a particular
business function, business process or business unit.
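The relationship between a warehouse and a department-owned mart can be sketched as a filtered view over a shared table, here using SQLite. The schema and the metric names are hypothetical.

```python
import sqlite3

# A data mart sketched as a department-specific view over the warehouse:
# marketing sees only the subject area it needs.

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE warehouse (dept TEXT, metric TEXT, value REAL)")
db.executemany("INSERT INTO warehouse VALUES (?, ?, ?)", [
    ("marketing", "campaign_roi", 1.8),
    ("finance", "cash_ratio", 0.9),
    ("marketing", "click_rate", 0.04),
])
db.execute("""CREATE VIEW marketing_mart AS
              SELECT metric, value FROM warehouse
              WHERE dept = 'marketing'""")

rows = db.execute("SELECT metric FROM marketing_mart ORDER BY metric").fetchall()
```

In practice a mart is usually a physically separate, department-owned database rather than a view, but the filtering principle is the same.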
BI tools are widely accepted as a new middleware between transactional applications and
decision support applications, thereby decoupling systems tailored to an efficient handling of
business transactions from systems tailored to an efficient support of business decisions. The
capabilities of BI include decision support, online analytical processing, statistical analysis,
forecasting, and data mining. The following are the major components that constitute BI.
Data Sources
Data sources can be operational databases, historical data, external data (for example, from market research companies or from the Internet), or information from the already existing data
warehouse environment. The data sources can be relational databases or any other data
structure that supports the line of business applications. They also can reside on many different
platforms and can contain structured information, such as tables or spreadsheets, or
unstructured information, such as plaintext files or pictures and other multimedia information.

Experts View: Experts view BI in different ways. To data warehousing experts, BI is a supplementary system and is very new to them; they treat BI as a technology platform for decision support applications. To data mining experts, BI is a set of advanced decision support systems built on data mining techniques and applications of algorithms. To statisticians, BI is viewed as a forecasting and multidimensional-analysis-based tool.
Approaches in Data Warehousing: The main key to a successful BI system is consolidating data from the many different enterprise operational systems into an enterprise data warehouse. Very few organizations have a full-fledged enterprise data warehouse, owing to the vast scope of the effort of consolidating all enterprise data. Berson (2002) emphasizes that, in view of the emerging, highly dynamic business environment, only the most competitive enterprises will achieve sustained market success. The organizations will distinguish
themselves by the capability to leverage information about their market place, customers, and
operations to capitalize on the business opportunities.

Analysis of the right information: Several surveys, including those by Gartner, Forrester and International Data Corporation, report that most firms throughout the globe are interested in investing in BI. It is to be noted that, despite major investments in enterprise resource planning (ERP) and customer relationship management (CRM) over the last decade, businesses are struggling to achieve competitive advantage, due to the difficulty of exploiting the information captured by these systems. Any corporation would look forward to one goal: the right access to information, quickly. Hence, firms need to support the analysis and application of information in order to make operational decisions; say, for marking down seasonal merchandise or providing certain recommendations to customers, firms need the right access to information quickly. Implementing smarter business processes is where business intelligence influences the bottom line and returns value to any firm.
In this rapidly changing world, consumers are now demanding quicker, more efficient service from businesses. To stay competitive, companies must meet or exceed the expectations of consumers. Companies will have to rely more heavily on their business intelligence systems to stay ahead of trends and future events. Business intelligence users are beginning to demand real-time business intelligence, or near-real-time analysis relating to their business, particularly in frontline operations. They will come to expect up-to-date and fresh information in the same fashion as they monitor stock quotes online. Monthly and even weekly analysis will not suffice. In the not-too-distant future, companies will become dependent on real-time business information in much the same fashion as people have come to expect to get information on the internet in just one or two clicks.
Also in the near future, business information will become more democratized, with end users from throughout the organization able to view information on their particular segment to see how it is performing.
So, in the future, the capability requirements of business intelligence will increase in the same
way that consumer expectations increase. It is therefore imperative that companies increase at
the same pace or even faster to stay competitive.
Business Intelligence enables organizations to make well informed business decisions and thus
can be the source of competitive advantages. This is especially true when firms are able to
extrapolate information from indicators in the external environment and make accurate
forecasts about future trends or economic conditions. Once business intelligence is gathered effectively and used proactively, firms can make decisions that benefit them.
The ultimate objective of business intelligence is to improve the timeliness and quality of
information. Timely and good quality information is like having a crystal ball that can give an
indication of what's the best course to take. Business intelligence reveals:

- The position of the firm in comparison to its competitors
- Changes in customer behavior and spending patterns
- The capabilities of the firm
- Market conditions, future trends, demographic and economic information
- The social, regulatory, and political environment
- What the other firms in the market are doing
Businesses realize that in this very competitive, fast-paced and ever-changing business environment, a key competitive quality is how quickly they respond and adapt to change.
Business intelligence enables them to use information gathered to quickly and constantly
respond to changes.
Fig. 1 presents an understanding of BI. A BI system, in other words, is a combination of data warehousing and decision support systems. The figure also reveals how data from disparate sources can be extracted and stored to be retrieved for analysis. The basic BI functions and reports are shown in Fig. 1.
The primary activities include gathering, preparing and analyzing data. The data itself must be of high quality. Data from the various sources is collected, transformed, cleansed, loaded and stored in a warehouse. The relevant data for a specific business area is then extracted from the data warehouse. A BI organization fully exploits data at every phase of the BI architecture as it progresses through various levels of informational metamorphosis. Raw data is born in operational environments, where transactional data pours in from every source and every corner of the enterprise. That is the business-intelligent organization's vision: a natural flow of data, from genesis to action, in which at each step the data is fully exploited to increase its information value for the enterprise. The challenge for BI, of course, is to build the organization's vision.

Fig 1. A basic understanding of BI

BI provides many benefits to companies utilizing it. It can eliminate a lot of the guesswork
within an organization, enhance communication among departments while coordinating
activities, and enable companies to respond quickly to changes in financial conditions,
customer preferences, and supply chain operations. BI improves the overall performance of the
company using it.
Information is often regarded as the second most important resource a company has (a
company's most valuable assets are its people). So when a company can make decisions based
on timely and accurate information, the company can improve its performance. BI also
expedites decision-making, as acting quickly and correctly on information before competing
businesses do can often result in competitively superior performance. It can also improve
customer experience, allowing for the timely and appropriate response to customer problems
and priorities.

Firms have recognized that business intelligence for the masses has arrived. Some of its applications are listed below.
- With superior BI tools, employees can easily convert their business knowledge, via analytical intelligence, to solve many business issues, such as increasing response rates from direct mail, telephone, e-mail, and Internet-delivered marketing campaigns.
- With BI, firms can identify their most profitable customers and the underlying reasons for those customers' loyalty, as well as identify future customers with comparable, if not greater, potential.
- Analyze click-stream data to improve e-commerce strategies.
- Quickly detect warranty-reported problems to minimize the impact of product design flaws.
- Discover money-laundering and other criminal activities.
- Analyze potential growth in customer profitability and reduce risk exposure through more accurate financial credit scoring of customers.
- Determine what combinations of products and service lines customers are likely to purchase and when.
- Analyze clinical trials for experimental drugs.
- Set more profitable rates for insurance premiums.
- Reduce equipment downtime by applying predictive maintenance.
- Determine, with attrition and churn analysis, why customers leave for competitors.
- Detect and deter fraudulent behavior, such as usage spikes when credit or phone cards are stolen.
- Identify promising new molecular drug compounds.
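The usage-spike idea can be sketched as a simple statistical test: flag any recent value that lies well above the historical mean. The usage figures are invented, and a real fraud system would draw on far richer features than a single threshold.

```python
import statistics

def flag_spikes(history, recent, threshold=3.0):
    """Flag recent usage values more than `threshold` standard
    deviations above the historical mean - a crude indicator that a
    credit or phone card may have been stolen."""
    mean = statistics.mean(history)
    sd = statistics.stdev(history)
    return [v for v in recent if v > mean + threshold * sd]
```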
Customers are the most critical aspect to a company's success. Without them a company cannot
exist. So it is very important that firms have information on their preferences. Firms must
quickly adapt to their changing demands. Business Intelligence enables firms to gather
information on the trends in the marketplace and come up with innovative products or services in anticipation of customers' changing demands.
Competitors can be a huge hurdle on a firm's way to success. Their objectives are the same as the firm's: to maximize profits and customer satisfaction. In order to be successful, firms must stay one step ahead of the competitors. In business, a firm does not want to play the catch-up game, because it would lose valuable market share. Business intelligence tells a firm what actions its competitors are taking, so it can make better-informed decisions.

Big data is a term for data sets that are so large or complex that traditional data
processing applications are inadequate. Challenges include analysis, capture, data curation, search, sharing, storage, transfer, visualization, querying and information privacy. The term
often refers simply to the use of predictive analytics or certain other advanced methods to
extract value from data, and seldom to a particular size of data set. Accuracy in big data may
lead to more confident decision making, and better decisions can result in greater operational
efficiency, cost reduction and reduced risk.
Analysis of data sets can find new correlations to "spot business trends, prevent diseases,
combat crime and so on." Scientists, business executives, practitioners of medicine, advertising
and governments alike regularly meet difficulties with large data sets in areas including Internet
search, finance and business informatics. Scientists encounter limitations in e-Science work,
including meteorology, genomics, connectomics, complex physics simulations, biology and
environmental research.
Big data usually includes data sets with sizes beyond the ability of commonly used software
tools to capture, curate, manage, and process data within a tolerable elapsed time. Big data
"size" is a constantly moving target, as of 2012 ranging from a few dozen terabytes to
many petabytes of data. Big data requires a set of techniques and technologies with new forms
of integration to reveal insights from datasets that are diverse, complex, and of a massive scale.
Data Mining
Data Mining is an analytic process designed to explore data (usually large amounts of data, typically business or market related, also known as "big data") in search of consistent patterns and/or systematic relationships between variables, and then to validate the findings by applying the detected patterns to new subsets of data. The ultimate goal of data mining is prediction, and predictive data mining is the most common type of data mining and the one that has the most direct business applications.

The process of data mining consists of three stages:

(1) the initial exploration,
(2) model building or pattern identification with validation/verification, and
(3) deployment (i.e., the application of the model to new data in order to generate predictions).
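The three stages can be walked through on a toy problem: explore a small data set with summary statistics, fit a least-squares line as the model, and deploy the fitted model to predict new cases. The ad-spend and sales figures are invented for the example.

```python
import statistics

# Hypothetical data: ad spend vs. sales for past campaigns.
spend = [1.0, 2.0, 3.0, 4.0, 5.0]
sales = [2.1, 3.9, 6.2, 8.0, 9.9]

# Stage 1 - exploration: summary statistics guide model choice.
mx = statistics.mean(spend)
my = statistics.mean(sales)

# Stage 2 - model building: fit a least-squares line by hand.
slope = (sum((x - mx) * (y - my) for x, y in zip(spend, sales))
         / sum((x - mx) ** 2 for x in spend))
intercept = my - slope * mx

# Stage 3 - deployment: apply the fitted model to new data.
def predict(new_spend):
    return slope * new_spend + intercept
```

Stage 2 would normally also include validation, e.g. checking the fit on a held-out sample rather than on the training data alone.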

Stage 1: Exploration. This stage usually starts with data preparation which may involve
cleaning data, data transformations, selecting subsets of records and - in case of data sets with
large numbers of variables ("fields") - performing some preliminary feature selection operations
to bring the number of variables to a manageable range (depending on the statistical methods
which are being considered). Then, depending on the nature of the analytic problem, this first
stage of the process of data mining may involve anywhere between a simple choice of
straightforward predictors for a regression model, to elaborate exploratory analyses using a
wide variety of graphical and statistical methods (see Exploratory Data Analysis (EDA)) in
order to identify the most relevant variables and determine the complexity and/or the general
nature of models that can be taken into account in the next stage.

Stage 2: Model building and validation. This stage involves considering various models and
choosing the best one based on their predictive performance (i.e., explaining the variability in
question and producing stable results across samples). This may sound like a simple operation,
but in fact, it sometimes involves a very elaborate process. There are a variety of techniques
developed to achieve that goal - many of which are based on so-called "competitive evaluation
of models," that is, applying different models to the same data set and then comparing their
performance to choose the best. These techniques - which are often considered the core
of predictive data mining - include: Bagging (Voting, Averaging), Boosting, Stacking (Stacked Generalizations), and Meta-Learning.

Stage 3: Deployment. This final stage involves using the model selected as best in the previous
stage and applying it to new data in order to generate predictions or estimates of the expected
outcome.
The concept of Data Mining is becoming increasingly popular as a business information
management tool, where it is expected to reveal knowledge structures that can guide decisions
in conditions of limited certainty. Recently, there has been increased interest in developing new
analytic techniques specifically designed to address the issues relevant to business Data
Mining (e.g., Classification Trees), but Data Mining is still based on the conceptual principles
of statistics including the traditional Exploratory Data Analysis (EDA) and modeling and it
shares with them both some components of its general approaches and specific techniques.


In business, data mining is the analysis of historical business activities, stored as static data in
data warehouse databases. The goal is to reveal hidden patterns and trends. Data mining
software uses advanced pattern recognition algorithms to sift through large amounts of data to
assist in discovering previously unknown strategic business information. Examples of what
businesses use data mining for include performing market analysis to identify new product
bundles, finding the root cause of manufacturing problems, preventing customer attrition,
acquiring new customers, cross-selling to existing customers, and profiling customers with
more accuracy.

Human rights
Data mining of government records, particularly records of the justice system (i.e., courts,
prisons), enables the discovery of systemic human rights violations in connection to the
generation and publication of invalid or fraudulent legal records by various government
agencies.

Spatial data mining

Spatial data mining is the application of data mining methods to spatial data. The end objective
of spatial data mining is to find patterns in data with respect to geography. So far, data mining
and Geographic Information Systems (GIS) have existed as two separate technologies, each
with its own methods, traditions, and approaches to visualization and data analysis. Particularly,
most contemporary GIS have only very basic spatial analysis functionality. The immense
explosion in geographically referenced data occasioned by developments in IT, digital mapping,
remote sensing, and the global diffusion of GIS emphasizes the importance of developing
data-driven inductive approaches to geographical analysis and modeling.

Project management is the process and activity of planning, organizing, motivating, and
controlling resources, procedures and protocols to achieve specific goals in scientific or daily
problems. A project is a temporary endeavor designed to produce a unique product, service or
result with a defined beginning and end (usually time-constrained, and often constrained by
funding or deliverables) undertaken to meet unique goals and objectives, typically to bring
about beneficial change or added value. The temporary nature of projects stands in contrast
with business as usual (or operations), which are repetitive, permanent, or semi-permanent
functional activities to produce products or services. In practice, the management of these two
systems is often quite different, and as such requires the development of distinct technical skills
and management strategies.
The primary challenge of project management is to achieve all of the project goals and
objectives while honoring the preconceived constraints. The primary constraints are scope,
time, quality and budget. The secondary and more ambitious challenge is to optimize the
allocation of necessary inputs and integrate them to meet pre-defined objectives.
There are a number of approaches for managing project activities including lean, iterative,
incremental, and phased approaches.
Regardless of the methodology employed, careful consideration must be given to the overall
project objectives, timeline, and cost, as well as the roles and responsibilities of all participants
and stakeholders.
The traditional approach
A traditional phased approach identifies a sequence of steps to be completed. In the "traditional
approach", five developmental components of a project can be distinguished (four stages plus
control):

Typical development phases of an engineering project

1. initiation
2. planning and design
3. execution and construction
4. monitoring and controlling systems
5. completion
Not all projects will have every stage, as projects can be terminated before they reach
completion. Some projects do not follow a structured planning and/or monitoring process. And
some projects will go through steps 2, 3 and 4 multiple times.
Many industries use variations of these project stages. For example, when working on a
brick-and-mortar design and construction, projects will typically progress through stages like
pre-planning, conceptual design, schematic design, design development, construction drawings (or
contract documents), and construction administration. In software development, this approach
is often known as the waterfall model, i.e., one series of tasks after another in linear sequence.
In software development many organizations have adapted the Rational Unified Process (RUP)
to fit this methodology, although RUP does not require or explicitly recommend this practice.
Waterfall development works well for small, well defined projects, but often fails in larger
projects of undefined and ambiguous nature. The Cone of Uncertainty explains some of this as
the planning made on the initial phase of the project suffers from a high degree of uncertainty.
This becomes especially true as software development is often the realization of a new or novel
product. In projects where requirements have not been finalized and can change, requirements
management is used to develop an accurate and complete definition of the behavior of software
that can serve as the basis for software development.[21] While the terms may differ from
industry to industry, the actual stages typically follow common steps to problem solving
"defining the problem, weighing options, choosing a path, implementation and evaluation."
Main article: PRINCE2

The PRINCE2 process model

PRINCE2 is a structured approach to project management released in 1996 as a generic project
management method. It combines the original PROMPT methodology (which evolved into the
PRINCE methodology) with IBM's MITP (managing the implementation of the total project)
methodology. PRINCE2 provides a method for managing projects within a clearly defined
framework.
PRINCE2 focuses on the definition and delivery of products, in particular their quality
requirements. As such, it defines a successful project as being output-oriented (not activity- or
task-oriented) through creating an agreed set of products that define the scope of the project and
provides the basis for planning and control, that is, how then to coordinate people and activities,
how to design and supervise product delivery, and what to do if products and therefore the
scope of the project has to be adjusted if it does not develop as planned.
In the method, each process is specified with its key inputs and outputs and with specific goals
and activities to be carried out to deliver a project's outcomes as defined by its Business Case.
This allows for continuous assessment and adjustment when deviation from the Business Case
is required.
PRINCE2 provides a common language for all participants in the project. The governance
framework of PRINCE2, with its roles and responsibilities, is fully described and requires
tailoring to suit the complexity of the project and the skills of the organisation.
Critical chain project management
Main article: Critical chain project management
Critical chain project management (CCPM) is a method of planning and managing project
execution designed to deal with uncertainties inherent in managing projects, while taking into
consideration limited availability of resources (physical, human skills, as well as management
& support capacity) needed to execute projects.
CCPM is an application of the theory of constraints (TOC) to projects. The goal is to increase
the flow of projects in an organization (throughput). Applying the first three of the five focusing
steps of TOC, the system constraint for all projects is identified as are the resources. To exploit

the constraint, tasks on the critical chain are given priority over all other activities. Finally,
projects are planned and managed to ensure that the resources are ready when the critical chain
tasks must start, subordinating all other resources to the critical chain.
The project plan should typically undergo resource leveling, and the longest sequence of
resource-constrained tasks should be identified as the critical chain. In some cases, such as
managing contracted sub-projects, it is advisable to use a simplified approach without resource
leveling.
In multi-project environments, resource leveling should be performed across projects. However,
it is often enough to identify (or simply select) a single "drum". The drum can be a resource that
acts as a constraint across projects, which are staggered based on the availability of that single
resource.
One can also use a "virtual drum" by selecting a task or group of tasks (typically integration
points) and limiting the number of projects in execution at that stage.
Process-based management
Main article: Process-based management
The incorporation of process-based management has been driven by the use of maturity models
such as CMMI (Capability Maturity Model Integration) and ISO/IEC 15504 (SPICE, Software
Process Improvement and Capability Determination).
Agile project management
Main article: Agile Project Management

The iteration cycle in agile project management

Agile project management encompasses several iterative approaches, based on the principles of
human interaction management and founded on a process view of human collaboration.
Agile-based methodologies are "most typically" employed in software development as well as the
"website, technology, creative, and marketing industries."[24] This sharply contrasts with
traditional approaches such as the Waterfall method. In agile software development or flexible
product development, the project is seen as a series of relatively small tasks conceived and
executed to conclusion as the situation demands in an adaptive manner, rather than as a
completely pre-planned process.

Advocates of this technique claim that:

It is the most consistent project management technique since it involves frequent testing
of the project under development.

It is the only technique in which the client is actively involved in the project.

The only disadvantage with this technique is that it should be used only if the client has
enough time to be actively involved in the project every now and then.

Agile is an umbrella term for multiple project management methodologies, including:

Scrum - A holistic approach to development that focuses on iterative goals set by the
Product Owner through a backlog, which is developed by the Delivery Team through
the facilitation of the Scrum Master.

Extreme Programming (XP) - A set of practices based on a set of principles and values,
with the goal of developing software that provides real value by implementing tight feedback
loops at all levels of the development process and using them to steer development. XP
popularized Test Driven Development (TDD) and Pair Programming.

eXtreme Manufacturing (XM) - An agile methodology based on Scrum, Kanban and

Kaizen that facilitates rapid engineering and prototyping.

Crystal Clear - An agile or lightweight methodology that focuses on colocation and

osmotic communication.

Kanban (看板) - A lean framework for process improvement that is frequently used to
manage work in progress (WIP) within agile projects. Kanban has been specifically applied
in software development.

Scrumban - A mixed scrum and kanban approach to project management. It focuses on
taking the flexibility of kanban and adding the structure of scrum to create a new way to
manage projects.

Lean project management

Main article: Lean project management
Lean project management uses the principles from lean manufacturing to focus on delivering
value with less waste and reduced time.
Extreme project management

Main article: Extreme project management

Planning and feedback loops in Extreme programming (XP) with the time frames of the
multiple loops.
In critical studies of project management it has been noted that several PERT-based models are
not well suited for the multi-project company environment of today. Most of them are
aimed at very large-scale, one-time, non-routine projects, and currently all kinds of
management are expressed in terms of projects.
Using complex models for "projects" (or rather "tasks") spanning a few weeks has been proven
to cause unnecessary costs and low maneuverability in several cases. The
generalization of Extreme Programming to other kinds of projects is extreme project
management, which may be used in combination with the process modeling and management
principles of human interaction management.
Benefits realization management
Main article: Benefits realisation management
Benefits realization management (BRM) enhances normal project management techniques
through a focus on outcomes (the benefits) of a project rather than products or outputs, and then
measuring the degree to which that is happening to keep a project on track. This can help to
reduce the risk of a completed project being a failure by delivering agreed upon
requirements/outputs but failing to deliver the benefits of those requirements.
In addition, BRM practices aim to ensure the alignment between project outcomes and business
strategies. The effectiveness of these practices is supported by recent research evidencing BRM
practices influencing project success from a strategic perspective across different countries and
industries.
An example of delivering a project to requirements might be agreeing to deliver a computer
system that will process staff data and manage payroll, holiday and staff personnel records.
Under BRM the agreement might be to achieve a specified reduction in staff hours required to
process and maintain staff data.

The project development stages

Traditionally, project management includes a number of elements: four to five process groups,
and a control system. Regardless of the methodology or terminology used, the same basic
project management processes will be used. Major process groups generally include:[7]


Initiating

Planning or design

Production or execution

Monitoring and controlling

Closing


In project environments with a significant exploratory element (e.g., research and

development), these stages may be supplemented with decision points (go/no go decisions) at
which the project's continuation is debated and decided. An example is the phase-gate model.

Initiating process group processes

The initiating processes determine the nature and scope of the project.[27] If this stage is not
performed well, it is unlikely that the project will be successful in meeting the business needs.
The key project controls needed here are an understanding of the business environment and
making sure that all necessary controls are incorporated into the project. Any deficiencies
should be reported and a recommendation should be made to fix them.

The initiating stage should include a plan that encompasses the following areas:

analyzing the business needs/requirements in measurable goals

reviewing of the current operations

financial analysis of the costs and benefits including a budget

stakeholder analysis, including users, and support personnel for the project

project charter including costs, tasks, deliverables, and schedules

Planning and design

After the initiation stage, the project is planned to an appropriate level of detail (see example of
a flow-chart). The main purpose is to plan time, cost and resources adequately to estimate the
work needed and to effectively manage risk during project execution. As with the Initiation
process group, a failure to adequately plan greatly reduces the project's chances of successfully
accomplishing its goals.
Project planning generally consists of

determining how to plan (e.g. by level of detail or rolling wave);

developing the scope statement;

selecting the planning team;

identifying deliverables and creating the work breakdown structure;

identifying the activities needed to complete those deliverables and networking the
activities in their logical sequence;

estimating the resource requirements for the activities;

estimating time and cost for activities;

developing the schedule;

developing the budget;

risk planning;

gaining formal approval to begin work.
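The sequencing and estimating steps listed above feed directly into schedule development. As a minimal sketch, with hypothetical activity names, durations, and dependencies, a forward pass over the activity network gives each activity's earliest start and finish and the overall project duration:

```python
# Hypothetical activity network: name -> (duration in days, predecessors).
activities = {
    "charter":  (2, []),
    "design":   (5, ["charter"]),
    "build":    (10, ["design"]),
    "test":     (4, ["build"]),
    "training": (3, ["design"]),
    "deploy":   (1, ["test", "training"]),
}

def forward_pass(acts):
    """Compute (earliest_start, earliest_finish) for each activity.

    Repeatedly schedules any activity whose predecessors are all scheduled;
    an activity starts at the latest finish among its predecessors.
    """
    earliest = {}
    remaining = dict(acts)
    while remaining:
        for name, (dur, preds) in list(remaining.items()):
            if all(p in earliest for p in preds):
                start = max((earliest[p][1] for p in preds), default=0)
                earliest[name] = (start, start + dur)
                del remaining[name]
    return earliest

schedule = forward_pass(activities)
project_duration = max(finish for _, finish in schedule.values())
print(schedule["deploy"], project_duration)  # (21, 22) 22
```

The longest path through this network (charter, design, build, test, deploy) is the critical path; the same forward pass is the first half of the critical path method used in the traditional approach described earlier.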

Additional processes, such as planning for communications and for scope management,
identifying roles and responsibilities, determining what to purchase for the project and holding
a kick-off meeting are also generally advisable.
For new product development projects, conceptual design of the operation of the final product
may be performed concurrent with the project planning activities, and may help to inform the
planning team when identifying deliverables and planning activities.

Executing process group processes[26]

Executing consists of the processes used to complete the work defined in the project plan to
accomplish the project's requirements.
Monitoring and controlling

Monitoring and controlling process group processes

Monitoring and controlling consists of those processes performed to observe project execution
so that potential problems can be identified in a timely manner and corrective action can be
taken, when necessary, to control the execution of the project. The key benefit is that project
performance is observed and measured regularly to identify variances from the project
management plan.
Monitoring and controlling includes:

Measuring the ongoing project activities ('where we are');

Monitoring the project variables (cost, effort, scope, etc.) against the project
management plan and the project performance baseline (where we should be);

Identifying corrective actions to address issues and risks properly ('how can we get on
track again');

Influencing the factors that could circumvent integrated change control so only
approved changes are implemented.
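Comparing the monitored variables against the performance baseline can be quantified. One widely used technique for this is earned value analysis; the text does not name it, but it reduces exactly this 'where we are' versus 'where we should be' comparison to cost and schedule variances. The figures below are invented for illustration:

```python
# Illustrative status figures at a reporting date (currency units).
planned_value = 100_000   # budgeted cost of work scheduled by now (baseline)
earned_value = 80_000     # budgeted cost of work actually completed
actual_cost = 95_000      # what the completed work actually cost

cost_variance = earned_value - actual_cost        # negative -> over budget
schedule_variance = earned_value - planned_value  # negative -> behind schedule
cpi = earned_value / actual_cost                  # cost performance index (<1 is bad)
spi = earned_value / planned_value                # schedule performance index (<1 is bad)

print(cost_variance, schedule_variance)   # -15000 -20000
print(round(cpi, 2), round(spi, 2))       # 0.84 0.8
```

Here the project is both over budget and behind schedule, which would trigger the corrective actions described in the list above.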

In multi-phase projects, the monitoring and control process also provides feedback between
project phases, in order to implement corrective or preventive actions to bring the project into
compliance with the project management plan.
Project maintenance is an ongoing process, and it includes:

Continuing support of end-users

Correction of errors

Updates of the software over time

Monitoring and controlling cycle

In this stage, auditors should pay attention to how effectively and quickly user problems are
resolved.
Over the course of any construction project, the work scope may change. Change is a normal
and expected part of the construction process. Changes can be the result of necessary design
modifications, differing site conditions, material availability, contractor-requested changes,
value engineering and impacts from third parties, to name a few. Beyond executing the change
in the field, the change normally needs to be documented to show what was actually
constructed. This is referred to as change management. Hence, the owner usually requires a
final record to show all changes or, more specifically, any change that modifies the tangible
portions of the finished work. The record is made on the contract documents usually, but not

necessarily limited to, the design drawings. The end product of this effort is what the industry
terms as-built drawings, or more simply, "as builts". The requirement for providing them is a
norm in construction contracts. Construction document management is a highly important task
undertaken with the aid of an online or desktop software system, or maintained through physical
documentation. Increasing legal requirements pertaining to the construction industry's maintenance
of correct documentation have caused an increase in the need for document management systems.
When changes are introduced to the project, the viability of the project has to be re-assessed. It
is important not to lose sight of the initial goals and targets of the projects. When the changes
accumulate, the forecasted result may not justify the original proposed investment in the
project. Successful project management identifies these components, and tracks and monitors
progress so as to stay within time and budget frames already outlined at the commencement of
the project.

Closing process group processes

Closing includes the formal acceptance of the project and the ending thereof. Administrative
activities include the archiving of the files and documenting lessons learned.
This phase consists of:[7]

Contract closure: Complete and settle each contract (including the resolution of any
open items) and close each contract applicable to the project or project phase.

Project close: Finalize all activities across all of the process groups to formally close
the project or a project phase

Also included in this phase is the Post Implementation Review. This is a vital phase of the
project for the project team to learn from experiences and apply to future projects. Normally a
Post Implementation Review consists of looking at things that went well and analysing things
that went badly on the project to come up with lessons learned.
Project controlling and project control systems
Project controlling should be established as an independent function in project management. It
implements verification and controlling function during the processing of a project in order to
reinforce the defined performance and formal goals.[30] The tasks of project controlling are also:

the creation of infrastructure for the supply of the right information and its update

the establishment of a way to communicate disparities of project parameters

the development of project information technology based on an intranet or the
determination of a project key performance index (KPI) system

divergence analyses and generation of proposals for potential project regulations[31]

the establishment of methods to accomplish an appropriate project structure, project
workflow organization, project control and governance

creation of transparency among the project parameters[32]

Fulfillment and implementation of these tasks can be achieved by applying specific methods
and instruments of project controlling. The following methods of project controlling can be
applied:
investment analysis

cost-benefit analysis

value benefit analysis

expert surveys

simulation calculations

risk-profile analysis

surcharge calculations

milestone trend analysis

cost trend analysis


Project control is that element of a project that keeps it on-track, on-time and within budget.[29]
Project control begins early in the project with planning and ends late in the project with
post-implementation review, with thorough involvement at each step in the process. Projects
may be audited or reviewed while the project is in progress. Formal audits are generally risk or
compliance-based and management will direct the objectives of the audit. An examination may
include a comparison of approved project management processes with how the project is
actually being managed.[34] Each project should be assessed for the appropriate level of control
needed: too much control is too time consuming, too little control is very risky. If project

control is not implemented correctly, the cost to the business should be clarified in terms of
errors and fixes.
Control systems are needed for cost, risk, quality, communication, time, change, procurement,
and human resources. In addition, auditors should consider how important the projects are to
the financial statements, how reliant the stakeholders are on controls, and how many controls
exist. Auditors should review the development process and procedures for how they are
implemented. The process of development and the quality of the final product may also be
assessed if needed or requested. A business may want the auditing firm to be involved
throughout the process to catch problems earlier on so that they can be fixed more easily. An
auditor can serve as a controls consultant as part of the development team or as an independent
auditor as part of an audit.
Businesses sometimes use formal systems development processes. These help assure that
systems are developed successfully. A formal process is more effective in creating strong
controls, and auditors should review this process to confirm that it is well designed and is
followed in practice. A good formal systems development plan outlines:

A strategy to align development with the organization's broader objectives

Standards for new systems

Project management policies for timing and budgeting

Procedures describing the process

Evaluation of quality of change


For many years, parents of District of Columbia public school children complained about buses
running late or not showing up. A federal court appointed an independent transportation
administrator and enlisted Satellite Security Systems, or S3, to track the movements of the
districts buses. S3 provides satellite tracking services to clients such as the District of

Columbia, Fairfax County, state and federal government agencies, police departments, and
private companies.
These services equip each vehicle or person they are monitoring with a tracking device using
global positioning system (GPS) technology. GPS is a navigation system operated by the U.S.
Department of Defense based on satellites that continually broadcast their position, time, and
date. GPS receivers on the ground, which can be attached to vehicles, cell phones, or other
equipment, use information from the satellite signals to calculate their own locations. Cell
phones are now equipped with GPS.
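The position calculation a receiver performs can be illustrated in simplified form. Real GPS solves in three dimensions and uses a fourth satellite to cancel receiver clock error; the 2-D sketch below, with invented anchor points and distances, shows only the underlying trilateration idea:

```python
import math

def trilaterate(p1, r1, p2, r2, p3, r3):
    """Estimate a 2-D position from three known points and measured distances.

    Subtracting the three circle equations pairwise eliminates the quadratic
    terms, leaving a 2x2 linear system solved here by Cramer's rule.
    """
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    a11, a12 = 2 * (x2 - x1), 2 * (y2 - y1)
    a21, a22 = 2 * (x3 - x1), 2 * (y3 - y1)
    b1 = r1**2 - r2**2 + x2**2 - x1**2 + y2**2 - y1**2
    b2 = r1**2 - r3**2 + x3**2 - x1**2 + y3**2 - y1**2
    det = a11 * a22 - a12 * a21
    x = (b1 * a22 - a12 * b2) / det
    y = (a11 * b2 - b1 * a21) / det
    return x, y

# Receiver at the unknown point (3, 4); distances measured to three anchors.
pos = trilaterate((0, 0), 5.0, (10, 0), math.sqrt(65), (0, 10), math.sqrt(45))
print(pos)
```

A GPS receiver does the same thing with satellite positions broadcast in the signal and distances inferred from signal travel time.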
The D.C. public school system is spending $6 million on its GPS tracking system. It is
equipping buses with GPS locators and special-needs children riding those buses with ID cards
that log when they get on and off their buses. Parents receive secret codes that enable them to
use the Internet to track their children. S3's monitoring center picks up GPS information from
the tracking devices and monitors the locations of the buses on video screens. Most of the
monitoring is automated, and the S3 staff intervenes primarily in emergencies. S3 maintains
each day's tracking data for long periods, and clients can access historical tracking data if they
wish.
S3 provides detailed information to the D.C. public schools: each bus's route throughout the
day, when the bus stops, when the doors open and close, the speed, and when the ignition is
turned on and off. The S3 system includes a database with information on the bus passengers:
each child's name, address, disabilities, allergies, contact information, and when their school
days begin and end.
David Gilmore, the court-appointed transportation administrator for the D.C. public schools, has
seen improvement in bus driver performance. Reports of bus drivers making detours to banks
or to take long lunches are diminishing.
Parents are also pleased. "I like that the system lets you watch them, because you never know
what's going on in the bus," says Deneen Prior, whose three children ride D.C. public school
buses. However, she also worries about the location tracking data being misused. "I don't want
anybody watching them that's not supposed to be watching them," she notes. Others feel the
same way. Location tracking has benefits, but it also opens the door to potential invasion of
privacy.
Many people may not like having their physical movements tracked so closely. Location
information might help direct a tow truck to a broken-down car, but it could also be used to find
out where the driver went during the lunch hour.
For similar reasons, privacy advocacy groups have opposed the use of radio-frequency
identification (RFID) tags in consumer items. RFID tags are small silicon chips equipped with
tiny antennas that enable them to communicate with RFID readers and track the location of
items as they move. When placed on individual products, they allow companies to tell exactly
when a product leaves a store or learn more about the actions of consumers buying the item.
Designer Lauren Scott had planned to add radio frequency tags to the children's clothing she
designed to help parents keep track of their children. An RFID tag sewn into a child's clothing
could store vital medical information or track the wearer's location to prevent children from
being abducted or wandering away. As a result of the controversy surrounding RFID, however,
several of Scott's major customers asked that the tags not be sewn directly into the clothing.
The D.C. public school system faced a real problem in trying to make sure its drivers were
transporting children safely and promptly to school. Location tracking technology provided a

solution, but it also introduced the possibility that information about the people or vehicles S3
tracked could be used for the wrong purpose. Location tracking technology had a similar
impact for designer Lauren Scott's children's clothing business.
This solution created what we call an ethical dilemma, pitting the legitimate need to know
what drivers of school buses were doing against the fear that such information could be used to
threaten individual privacy. Another ethical dilemma might occur if you were implementing a
new information system that reduced labor costs and eliminated employees' jobs. You need to
be aware of the negative impacts of information systems and you need to balance the negative
consequences with the positive ones.

Information systems raise new and often-perplexing ethical problems. This is more true today
than ever because of the challenges posed by the Internet and electronic commerce to the
protection of privacy and intellectual property. Other ethical issues raised by widespread use of
information systems include establishing accountability for the consequences of information
systems, setting standards to safeguard system quality that protect the safety of individuals and
society, and preserving values and institutions considered essential to the quality of life in an
information society. Whether you run your own business or work in a large company, you'll be
confronting these issues, and you'll need to know how to deal with them.

If your career is in finance and accounting, you will need to ensure that the information
systems you work with are protected from computer fraud and abuse.
If your career is in human resources, you will be involved in developing and enforcing a
corporate ethics policy and in providing special training to sensitize managers and employees to
the new ethical issues surrounding information systems.
If your career is in information systems, you will need to make management aware of the
ethical implications of the technologies used by the firm and help management establish a code
of ethics for information systems.
If your career is in manufacturing, production, or operations management, you will need to
deal with data quality and software problems that could interrupt the smooth and accurate flow
of information among disparate manufacturing and production systems and among supply chain
partners.
If your career is in sales and marketing, you will need to balance systems that gather and
analyze customer data with the need for protecting consumer privacy.
Examples of failed ethical judgments by managers
Top three executives convicted for misstating earnings using illegal accounting schemes and
making false representations to shareholders. Bankruptcy declared in 2001.
Second-largest U.S. telecommunications firm. Chief executive convicted for improperly
inflating revenue by billions using illegal accounting methods. Bankruptcy declared in July
2002 with $41 billion in debts.
Indicted for assisting Enron in the creation of financial vehicles that had no business purpose,
enabling Enron to misstate its earnings.
Italys eighth-largest industrial group indicted for misstating more than $5 billion in revenues,
earnings, and assets over several years; senior executives indicted for embezzlement.

Pharmaceutical firm agreed to pay a fine of $150 million for misstating its revenues by $1.5
billion and inflating its stock value.
Gregory Reyes, the CEO of Brocade Communications Systems Inc. until January 2005,
indicted in criminal and civil cases in 2006 of backdating options and concealing millions of
dollars of compensation expenses from shareholders. Nearly 100 other Silicon Valley tech firms
are under investigation for similar practices.
Senior tax accountants of three of the leading Big Four public accounting firms are indicted
by the Justice Department over the selling of abusive tax shelters to wealthy individuals in the
period 2000- 2005. This case is frequently referred to as the largest tax fraud case in history.
Ethical, social, and political issues are closely linked. The ethical dilemma you may face as a
manager of information systems typically is reflected in social and political debate. One way to
think about these relationships is through an analogy. Imagine society as a more or less calm pond on a summer day, a
delicate ecosystem in partial equilibrium with individuals and with social and political
institutions. Individuals know how to act in this pond because social institutions (family,
education, organizations) have developed well-honed rules of behavior, and these are supported
by laws developed in the political sector that prescribe behavior and promise sanctions for
violations. Now toss a rock into the center of the pond. But imagine instead of a rock that the
disturbing force is a powerful shock of new information technology and systems hitting a
society more or less at rest. What happens? Ripples, of course. Suddenly individual actors are
confronted with new situations often not covered by the old rules. Social institutions cannot
respond overnight to these ripples; it may take years to develop etiquette, expectations, social
responsibility, politically correct attitudes, or approved rules. Political institutions also require
time before developing new laws and often require the demonstration of real harm before they
act. In the meantime, you may have to act. You may be forced to act in a legal gray area.
We can use this model to illustrate the dynamics that connect ethical, social, and political
issues. This model is also useful for identifying the main moral dimensions of the information
society, which cut across various levels of action: individual, social, and political.


The major ethical, social, and political issues raised by information systems include the
following moral dimensions:
Information rights and obligations: What information rights do individuals and organizations
possess with respect to themselves? What can they protect? What obligations do individuals
and organizations have concerning this information?
Property rights and obligations: How will traditional intellectual property rights be protected in
a digital society in which tracing and accounting for ownership are difficult and ignoring such
property rights is so easy?
Accountability and control: Who can and will be held accountable and liable for the harm done
to individual and collective information and property rights?
System quality: What standards of data and system quality should we demand to protect
individual rights and the safety of society?
Quality of life: What values should be preserved in an information- and knowledge-based
society? Which institutions should we protect from violation? Which cultural values and
practices are supported by the new information technology?


Key technology trends that raise ethical issues:
Computing power doubles every 18 months: More organizations depend on computer systems for
critical operations.
Data storage costs rapidly declining: Organizations can easily maintain detailed databases on
individuals.
Data analysis advances: Companies can analyze vast quantities of data gathered on individuals to
develop detailed profiles of individual behavior.
Networking advances and the Internet: Copying data from one location to another and accessing
personal data from remote locations are much easier.
Ethical issues long preceded information technology. Nevertheless, information technology has
heightened ethical concerns, taxed existing social arrangements, and made some laws obsolete
or severely crippled. Information technologies and systems have also created new opportunities
for criminal behavior and mischief. There are four key technological trends responsible for
these ethical stresses. The doubling of computing power every 18 months has made it possible
for most organizations to use information systems for their core production processes. As a
result, our dependence on systems and our vulnerability to system errors and poor data quality
have increased. The very same information systems that lead to high levels of productivity also
create opportunities for abuse. Social rules and laws have not yet adjusted to this dependence.
Standards for ensuring the accuracy and reliability of information systems are not universally
accepted or enforced. Advances in data storage techniques and rapidly declining storage costs
have been responsible for the multiplying databases on individuals (employees, customers, and
potential customers) maintained by private and public organizations.
These advances in data storage have made the routine violation of individual privacy both
cheap and effective. Already, massive data storage systems are cheap enough for regional and
even local retailing firms to use in identifying customers. For instance, the major search firms
like Google, America Online (AOL), MSN, and Yahoo! maintain detailed search histories on
the more than 75 million Americans who use Internet search engines every day and who
generate more than 200 million searches each day. These huge collections of consumer
intentions become the natural targets of private firms looking for market advantage,
government agencies, and private investigators.
Advances in data analysis techniques for large pools of data are another technological trend that
heightens ethical concerns because companies and government agencies are able to find out
much detailed personal information about individuals. With contemporary data management
tools, companies can assemble and combine the myriad pieces of information about you stored
on computers much more easily than in the past.
Think of all the ways you generate computer information about yourself: credit card
purchases; telephone calls; magazine subscriptions; video rentals; mail-order purchases;
banking records; local, state, and federal government records (including court and police
records); and visits to Web sites to read Web materials, use search engines, and write blogs (see
Chapter 10). Put together and mined properly, this information could reveal not only your credit
information but also your driving habits, your tastes, your associations, intended purchases,
political views, and interests. What you thought was private, in fact, can quickly become public.
Companies with products to sell purchase relevant information from these sources to help them
more finely target their marketing campaigns. Chapters 3 and 6 describe how companies can
analyze large pools of data from multiple sources to rapidly identify buying patterns of

customers and suggest individual responses. The use of computers to combine data from
multiple sources and create electronic dossiers of detailed information on individuals is called
profiling. For example, hundreds of Web sites allow DoubleClick, an Internet
advertising broker, to track the activities of their visitors in exchange for revenue from
advertisements based on visitor information DoubleClick gathers. DoubleClick uses this
information to create a profile of each online visitor, adding more detail to the profile as the
visitor accesses an associated DoubleClick site. Over time, DoubleClick can create a detailed
dossier of a person's spending and computing habits on the Web that can be sold to companies
to help them target their Web ads more precisely.
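To make profiling concrete, here is a minimal sketch of the underlying operation: merging visit records reported by many partner sites, keyed by a shared tracking-cookie ID, into one dossier per visitor. All data, site names, and field names below are invented for illustration; a real ad broker's pipeline is far more elaborate.

```python
from collections import defaultdict

# Hypothetical visit records collected by partner sites, keyed by a
# shared tracking-cookie ID (all data here is invented for illustration).
visits = [
    {"cookie": "u42", "site": "news.example.com", "page": "/politics"},
    {"cookie": "u42", "site": "shop.example.com", "page": "/cameras"},
    {"cookie": "u42", "site": "shop.example.com", "page": "/tripods"},
    {"cookie": "u99", "site": "news.example.com", "page": "/sports"},
]

def build_profiles(records):
    """Merge visit records from many sites into one dossier per cookie ID."""
    profiles = defaultdict(lambda: {"sites": set(), "pages": []})
    for r in records:
        p = profiles[r["cookie"]]
        p["sites"].add(r["site"])
        p["pages"].append(r["page"])
    return dict(profiles)

profiles = build_profiles(visits)
print(sorted(profiles["u42"]["sites"]))
# → ['news.example.com', 'shop.example.com']
```

The point of the sketch is that a single stable identifier lets records from unrelated sites be joined into one behavioral picture of a person; real trackers additionally enrich such profiles with demographic and purchase data.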
ChoicePoint, described in the Interactive Session on Management, gathers data from police,
criminal, and motor vehicle records; credit and employment histories; current and previous
addresses; professional licenses; and insurance claims to assemble and maintain electronic
dossiers on almost every adult in the United States. The company sells this personal information
to businesses and government agencies. Demand for personal data is so enormous that data
broker businesses such as ChoicePoint are booming. A new data analysis technology called
nonobvious relationship awareness (NORA) has given both the government and the private
sector even more powerful profiling capabilities. NORA can take information about people
from many disparate sources, such as employment applications, telephone records, customer
listings, and wanted lists, and correlate relationships to find obscure hidden connections that
might help identify criminals or terrorists. For instance, an applicant for a government security
job might have received phone calls from a person wanted by the police. This dyad (grouping of
two) might also share the same religion, attend the same church, and be part of a small group
with frequent telephone contacts.
NORA technology scans data and extracts information as the data are being generated so that it
could, for example, instantly discover a man at an airline ticket counter who shares a phone
number with a known terrorist before that person boards an airplane. The technology is
considered a valuable tool for homeland security but does have privacy implications because it
can provide such a detailed picture of the activities and associations of a single individual.
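The core idea behind NORA-style correlation can be illustrated with a small sketch: group records from disparate sources by a shared identifier and surface identifiers that link two or more distinct people. The records, names, and phone numbers below are invented; a real system would correlate many identifier types (addresses, employers, aliases) across far larger datasets.

```python
from collections import defaultdict

# Invented records from disparate sources; each links a person to a phone number.
records = [
    ("employment_application", "J. Smith", "555-0100"),
    ("telephone_log",          "A. Doe",   "555-0100"),
    ("customer_list",          "B. Jones", "555-0177"),
    ("watch_list",             "A. Doe",   "555-0199"),
]

def nonobvious_links(rows):
    """Group people by shared identifier to surface hidden connections."""
    by_phone = defaultdict(set)
    for source, person, phone in rows:
        by_phone[phone].add(person)
    # Keep only identifiers shared by two or more distinct people.
    return {phone: sorted(people)
            for phone, people in by_phone.items() if len(people) > 1}

links = nonobvious_links(records)
print(links)  # → {'555-0100': ['A. Doe', 'J. Smith']}
```

Here the shared phone number reveals a connection between a job applicant and a person on a watch list that neither source shows on its own, which is exactly the privacy concern the text describes.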
Finally, advances in networking, including the Internet, promise to reduce greatly the costs of
moving and accessing large quantities of data and open the possibility of mining large pools of
data remotely using small desktop machines, permitting an invasion of privacy on a scale and
with a precision heretofore unimaginable. If computing and networking technologies continue
to advance at the same pace as in the past, by 2023, large organizations will be able to devote
the equivalent of a contemporary desktop personal computer to monitoring each of the 350
million individuals who will then be living in the United States.
The development of global digital communication networks widely available to individuals and
businesses poses many ethical and social concerns. Who will account for the flow of
information over these networks? Will you be able to trace information collected about you?
What will these networks do to the traditional relationships between family, work, and leisure?
How will traditional job designs be altered when millions of employees become
"subcontractors" using mobile offices for which they themselves must pay?

Ethical choices are decisions made by individuals who are responsible for the consequences of
their actions. Responsibility is a key element of ethical action. Responsibility means that you
accept the potential costs, duties, and obligations for the decisions you make. Accountability is
a feature of systems and social institutions: It means that mechanisms are in place to determine
who took responsible action and who is responsible. Systems and institutions in which it is
impossible to find out who took what action are inherently incapable of ethical analysis or
ethical action.
Liability extends the concept of responsibility further to the area of laws. Liability is a feature
of political systems in which a body of laws is in place that permits individuals to recover the
damages done to them by other actors, systems, or organizations. Due process is a related
feature of law-governed societies and is a process in which laws are known and understood and
there is an ability to appeal to higher authorities to ensure that the laws are applied correctly.
These basic concepts form the underpinning of an ethical analysis of information systems and
those who manage them. First, information technologies are filtered through social institutions,
organizations, and individuals.
Systems do not have impacts by themselves. Whatever information system impacts exist are
products of institutional, organizational, and individual actions and behaviors. Second,
responsibility for the consequences of technology falls clearly on the institutions, organizations,
and individual managers who choose to use the technology. Using information technology in a
socially responsible manner means that you can and will be held accountable for the
consequences of your actions. Third, in an ethical, political society, individuals and others can
recover damages done to them through a set of laws characterized by due process.

When confronted with a situation that seems to present ethical issues, how should you analyze
it? The following five-step process should help.
1. Identify and describe clearly the facts. Find out who did what to whom, and where, when,
and how. In many instances, you will be surprised at the errors in the initially reported facts,
and often you will find that simply getting the facts straight helps define the solution. It also
helps to get the opposing parties involved in an ethical dilemma to agree on the facts.
2. Define the conflict or dilemma and identify the higher-order values involved. Ethical, social,
and political issues always reference higher values. The parties to a dispute all claim to be
pursuing higher values (e.g., freedom, privacy, protection of property, and the free enterprise
system). Typically, an ethical issue involves a dilemma: two diametrically opposed courses of
action that support worthwhile values. For example, the chapter-ending case study illustrates
two competing values: the need to protect citizens from terrorist acts and the need to protect
individual privacy.
3. Identify the stakeholders. Every ethical, social, and political issue has stakeholders: players in
the game who have an interest in the outcome, who have invested in the situation, and usually
who have vocal opinions. Find out the identity of these groups and what they want. This will be
useful later when designing a solution.
4. Identify the options that you can reasonably take. You may find that none of the options
satisfy all the interests involved, but that some options do a better job than others. Sometimes
arriving at a good or ethical solution may not always be a balancing of consequences to stakeholders.
5. Identify the potential consequences of your options. Some options may be ethically correct
but disastrous from other points of view. Other options may work in one instance but not in
other similar instances. Always ask yourself, "What if I choose this option consistently over time?"
Once your analysis is complete, what ethical principles or rules should you use to make a
decision? What higher-order values should inform your judgment? Although you are the only
one who can decide which among many ethical principles you will follow, and how you will
prioritize them, it is helpful to consider some ethical principles with deep roots in many cultures
that have survived throughout recorded history.

1. Do unto others as you would have them do unto you (the Golden Rule). Putting
yourself into the place of others, and thinking of yourself as the object of the decision, can
help you think about fairness in decision making.
2. If an action is not right for everyone to take, it is not right for anyone (Immanuel Kant's
Categorical Imperative). Ask yourself, "If everyone did this, could the organization, or
society, survive?"
3. If an action cannot be taken repeatedly, it is not right to take at all (Descartes' rule of
change). This is the slippery-slope rule: An action may bring about a small change now that
is acceptable, but if it is repeated, it would bring unacceptable changes in the long run. In the
vernacular, it might be stated as "once started down a slippery path, you may not be able to stop."
4. Take the action that achieves the higher or greater value (the Utilitarian Principle). This rule
assumes you can prioritize values in a rank order and understand the consequences of
various courses of action.
5. Take the action that produces the least harm or the least potential cost (Risk Aversion
Principle). Some actions have extremely high failure costs of very low probability (e.g.,
building a nuclear generating facility in an urban area) or extremely high failure costs of
moderate probability (speeding and automobile accidents). Avoid these high-failure-cost
actions, paying greater attention, obviously, to those with moderate to high probability.
6. Assume that virtually all tangible and intangible objects are owned by someone else unless
there is a specific declaration otherwise. (This is the ethical "no free lunch" rule.) If
something someone else has created is useful to you, it has value, and you should assume the
creator wants compensation for this work.
Although these ethical rules cannot always be guides to action, actions that do not easily pass these
rules deserve some very close attention and a great deal of caution. The appearance of unethical
behavior may do as much harm to you and your company as actual unethical behavior.




Privacy is the claim of individuals to be left alone, free from surveillance or interference
from other individuals or organizations, including the state.
Claims to privacy are also involved at the workplace: millions of employees are subject
to electronic and other forms of high-tech surveillance (Ball, 2001).
Information technology and systems threaten individual claims to privacy by making
the invasion of privacy cheap, profitable, and effective.


Most American and European privacy law is based on a regime called Fair Information
Practices (FIP) first set forth in a report written in 1973 by a federal government advisory
committee (U.S. Department of Health, Education, and Welfare, 1973). Fair Information
Practices (FIP) is a set of principles governing the collection and use of information about
individuals. FIP principles are based on the notion of a mutuality of interest between the record
holder and the individual. The individual has an interest in engaging in a transaction, and the
record keeper (usually a business or government agency) requires information about the
individual to support the transaction. Once information is gathered, the individual maintains an
interest in the record, and the record may not be used to support other activities without the
individuals consent.

In July 1998, the U.S. Congress passed the Children's Online Privacy Protection Act (COPPA),
requiring Web sites to obtain parental permission before collecting information on children
under the age of 13. The FTC has recommended additional legislation to protect online
consumer privacy in advertising networks that collect records of consumer Web activity to
develop detailed profiles, which are then used by other companies to target online ads. Other
proposed Internet privacy legislation focuses on protecting the online use of personal
identification numbers, such as social security
numbers; protecting personal information collected on the Internet that deals with individuals
not covered by the Children's Online Privacy Protection Act of 1998; and limiting the use of
data mining for homeland security (see the chapter-ending case study).
Privacy protections have also been added to recent laws deregulating financial services and
safeguarding the maintenance and transmission of health information about individuals. The
Gramm-Leach-Bliley Act of 1999, which repeals earlier restrictions on affiliations among
banks, securities firms, and insurance companies, includes some privacy protection for
consumers of financial services. All financial institutions are required to disclose their policies
and practices for protecting the privacy of nonpublic personal information and to allow
customers to opt out of information-sharing arrangements with nonaffiliated third parties.

1. Notice/awareness (core principle) - Web sites must disclose their information practices
before collecting data. Includes identification of collector; uses of data; other recipients
of data; nature of collection (active/inactive); voluntary or required status; consequences
of refusal; and steps taken to protect confidentiality, integrity, and quality of the data
2. Choice/consent (core principle) - There must be a choice regime in place allowing
consumers to choose how their information will be used for secondary purposes other
than supporting the transaction, including internal use and transfer to third parties
3. Access/participation - Consumers should be able to review and contest the accuracy
and completeness of data collected about them in a timely, inexpensive process
4. Security - Data collectors must take responsible steps to assure that consumer
information is accurate and secure from unauthorized use
5. Enforcement - There must be in place a mechanism to enforce FIP principles. This can
involve self-regulation, legislation giving consumers legal remedies for violations, or
federal statutes and regulations
Internet technology has posed new challenges for the protection of individual privacy.
Information sent over this vast network of networks may pass through many different computer
systems before it reaches its final destination. Each of these systems is capable of monitoring,
capturing, and storing communications that pass through it.
It is possible to record all online activities of literally tens of millions of people, including
which online newsgroups or files a person has accessed, which Web sites and Web pages he or
she has visited, and what items that person has inspected or purchased over the Web. Much of
this monitoring and tracking of Web site visitors occurs in the background without the visitor's
knowledge. Tools to monitor visits to the World Wide Web have become popular because they
help organizations determine who is visiting their Web sites and how to better target their
offerings. Some firms also monitor the Internet usage of their employees to see how they are
using company network resources. Web retailers now have access to software that lets them
watch the online shopping behavior of individuals and groups while they are visiting a Web site.
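Clickstream monitoring of this kind starts from ordinary server access logs. The following sketch uses an invented, highly simplified log format ("visitor_id page" per line) to show how a site operator might count page views per visitor:

```python
from collections import Counter

# Simplified Web-server log lines: "visitor_id page" (invented data).
log_lines = [
    "v1 /home",
    "v1 /products/tent",
    "v2 /home",
    "v1 /checkout",
    "v2 /products/stove",
]

def pages_per_visitor(lines):
    """Count page views per visitor from a simplified access log."""
    counts = Counter(line.split()[0] for line in lines)
    return dict(counts)

summary = pages_per_visitor(log_lines)
print(summary)  # → {'v1': 3, 'v2': 2}
```

Production log analysis would parse full access-log entries and identify visitors via cookies or session IDs rather than a bare visitor field, but the privacy implication is the same: every page view leaves a record that can be attributed to a person.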


Contemporary information systems have severely challenged existing law and social practices
that protect private intellectual property. Intellectual property is considered to be intangible
property created by individuals or corporations.
Information technology has made it difficult to protect intellectual property because
computerized information can be so easily copied or distributed on networks. Intellectual
property is subject to a variety of protections under three different legal traditions: trade secrets,
copyright, and patent law.

Trade Secrets
Any intellectual work producta formula, device, pattern, or compilation of data-used for a
business purpose can be classified as a trade secret, provided it is not based on information in
the public domain. Protections for trade secrets vary from state to state. In general, trade secret
laws grant a monopoly on the ideas behind a work product, but it can be a very tenuous monopoly.
Software that contains novel or unique elements, procedures, or compilations can be included
as a trade secret. Trade secret law protects the actual ideas in a work product, not only their
manifestation. To make this claim, the creator or owner must take care to bind employees and
customers with nondisclosure agreements and to prevent the secret from falling into the public domain.
The limitation of trade secret protection is that, although virtually all software programs of any
complexity contain unique elements of some sort, it is difficult to prevent the ideas in the work
from falling into the public domain when the software is widely distributed.
Copyright
Copyright is a statutory grant that protects creators of intellectual property from having their
work copied by others for any purpose during the life of the author plus an additional 70 years
after the authors death. For corporate-owned works, copyright protection lasts for 95 years
after their initial creation.
The intent behind copyright laws has been to encourage creativity and authorship by ensuring
that creative people receive the financial and other benefits of their work. Most industrial
nations have their own copyright laws, and there are several international conventions and
bilateral agreements through which nations coordinate and enforce their laws.
Copyright protects against copying of entire programs or their parts. Damages and relief are
readily obtained for infringement.
The drawback to copyright protection is that the underlying ideas behind a work are not
protected, only their manifestation in a work. A competitor can use your software, understand
how it works, and build new software that follows the same concepts without infringing on a copyright.
"Look and feel" copyright infringement lawsuits are precisely about the distinction between an
idea and its expression.
Case: In the early 1990s Apple Computer sued Microsoft Corporation and Hewlett-Packard for
infringement of the expression of Apple's Macintosh interface, claiming that the defendants
copied the expression of overlapping windows. The defendants countered that the idea of
overlapping windows can be expressed only in a single way and, therefore, were not protectable
under the merger doctrine of copyright law. When ideas and their expression merge, the
expression cannot be copyrighted.

Patents
A patent grants the owner an exclusive monopoly on the ideas behind an invention for 20 years.
The congressional intent behind patent law was to ensure that inventors of new machines,
devices, or methods receive the full financial and other rewards of their labor and yet still make
widespread use of the invention possible by providing detailed diagrams for those wishing to
use the idea under license from the patent's owner. The granting of a patent is determined by
the Patent Office and relies on court rulings.
The key concepts in patent law are originality, novelty, and invention.
The strength of patent protection is that it grants a monopoly on the underlying concepts and
ideas of software. The difficulty is passing stringent criteria of nonobviousness (e.g., the work
must reflect some special understanding and contribution), originality, and novelty, as well as
years of waiting to receive protection.


Contemporary information technologies, especially software, pose severe challenges to existing
intellectual property regimes and, therefore, create significant ethical, social, and political
issues. Digital media differ from physical media like books, periodicals, CDs, and newspapers
in terms of ease of replication; ease of transmission; ease of alteration; difficulty in classifying a
software work as a program, book, or even music; compactness, making theft easy; and
difficulties in establishing uniqueness.
The proliferation of electronic networks, including the Internet, has made it even more difficult
to protect intellectual property. Before widespread use of networks, copies of software, books,
magazine articles, or films had to be stored on physical media, such as paper, computer disks,
or videotape, creating some hurdles to distribution. Using networks, information can be more
widely reproduced and distributed. A study conducted by the International Data Corporation for
the Business Software Alliance found that more than one-third of the software worldwide was
counterfeit or pirated, and the Business Software Alliance reported $29 billion in yearly losses
from software piracy.
The Internet was designed to transmit information freely around the world, including
copyrighted information. With the World Wide Web in particular, you can easily copy and
distribute virtually anything to thousands and even millions of people around the world, even if
they are using different types of computer systems. Information can be illicitly copied from one
place and distributed through other systems and networks even though these parties do not
willingly participate in the infringement.
Case: Individuals have been illegally copying and distributing digitized MP3 music files on the
Internet for a number of years. File sharing services such as Napster, and later Grokster, Kazaa,
and Morpheus sprung up to help users locate and swap digital music files, including those
protected by copyright. Illegal file-sharing became so widespread that it threatened the viability
of the music recording industry. The recording industry won significant legal battles against
Napster, and later against Grokster and all commercial P2P networks. The U.S. Supreme Court
found in June 2005 that file-sharing networks that intentionally profited from illegal

distribution of music could be held liable for their actions. This decision forced most of the
large-scale commercial P2P networks to shut down, or to seek legal distribution agreements
with the music publishers.
Despite these victories in court, illegal music file sharing abounds on the Internet: 27 percent of
Internet users report downloading music from illegal sites (36 million Americans).
Mechanisms are being developed to sell and distribute books, articles, and other intellectual
property legally on the Internet, and the Digital Millennium Copyright Act (DMCA) of 1998 is
providing some copyright protection. The DMCA implemented a World Intellectual Property
Organization Treaty that makes it illegal to circumvent technology-based protections of
copyrighted materials. Internet service providers (ISPs) are required to take down sites of
copyright infringers that they are hosting once they are notified of the problem.
Microsoft and 1,400 other software and information content firms are represented by the
Software and Information Industry Association (SIIA), which lobbies for new laws and
enforcement of existing laws to protect intellectual property around the world. (SIIA was
formed on January 1, 1999, from the merger of the Software Publishers Association [SPA] and
the Information Industry Association [IIA].) The SIIA runs an antipiracy hotline for individuals
to report piracy activities and educational programs to help organizations combat software
piracy and has published guidelines for employee use of software.


Cultural issues
o Adapting to new technologies
o Developing trust
o Power asymmetry
o Policy implementation, etc.
Human interaction issues
o Recruitment and retention of technical personnel
o Social presence, etc.
Relationship issues
o Virtual teams
o Group cohesiveness
o Group facilitation
o Buyer-supplier relations, etc.
Security issues
o Misuse of data
o Virus/worm creation
o Internet abuse
o Data protection, etc.