
Q1

Describe key system strategies used by organizations to strengthen their competitive position
Although many managers understand why managing typical resources such as equipment and people is important, it is worthwhile to take a moment to examine the growing interdependence between a firm's ability to use information technology and its ability to implement corporate strategies and achieve corporate goals. Specifically, business firms invest heavily in information systems to achieve six strategic business objectives:
a. Operational excellence
b. New products, services, and business models
c. Customer and supplier intimacy
d. Improved decision making
e. Competitive advantage
f. Survival
Operational Excellence
Businesses continuously seek to improve the efficiency of their operations in order to achieve
higher profitability. Information systems and technologies are some of the most important tools
available to managers for achieving higher levels of efficiency and productivity in business
operations, especially when coupled with changes in business practices and management
behavior.
New Products, Services, and Business Models
Information systems and technologies are a major enabling tool for firms to create new products
and services, as well as entirely new business models. A business model describes how a
company produces, delivers, and sells a product or service to create wealth. As successful as Apple Inc., Netflix, and Wal-Mart have been with their established businesses, they have all introduced new products, services, and business models that have made them both competitive and profitable.
Customer and Supplier Intimacy
When a business really knows its customers, and serves them well, the way they want to be
served, the customers generally respond by returning and purchasing more. The result is
increased revenues and profits. Likewise with suppliers: the more a business engages its
suppliers, the better the suppliers can provide vital inputs. The result is a lower cost of doing

business. JC Penney is an excellent example of how information systems and technologies can be used extensively to better serve suppliers and retail customers. Its information system digitally links suppliers to each of its stores worldwide. Suppliers are able to ensure the continuous flow of products to the stores in order to satisfy customer demand.
Improved Decision Making
Information systems and technologies have made it possible for managers to use real-time data
from the marketplace when making decisions. Previously, managers did not have access to accurate and current data and so relied on forecasts, best guesses, and luck. The inability to make informed decisions resulted in rising costs and lost customers.
Competitive Advantage
Doing things better than your competitors, charging less for superior products, and responding to
customers and suppliers in real time all add up to higher sales and higher profits that your
competitors cannot match. Toyota and Wal-Mart are prime examples of how companies use
information systems and technologies to separate themselves from their competition. Toyota
worked its way to the top of its industry with the help of its legendary information system. Wal-Mart
is the most efficient retail store in the industry based in large part on how well it uses its
information resources.
Survival
Firms also invest in information systems and technologies because they are necessities of doing business. Information systems are not a luxury; in most businesses they are core to survival. For example, the first banks to introduce ATMs gained a major competitive advantage over their rivals, and in order to remain and survive in the retail banking industry, other banks had no choice but to provide ATM services to their banking customers as well.

Q2
Discuss the various emergent information technologies and software
The Emerging Digital Firm
A digital firm is one in which nearly all of the organization's significant business relationships
with customers, suppliers, and employees are digitally enabled, and key corporate assets are
managed through digital means.
When a firm goes digital, it's not just about adding a computer system to the mix. Throwing a computer system at outdated business processes is exactly the wrong thing to do. A truly digital firm has several characteristics that distinguish it from most firms claiming to be digitized:
i. Significant business relationships with customers, suppliers, and employees are digitally
enabled and mediated.
ii. Core business processes are accomplished through digital networks and span the entire
organization or link multiple organizations.
iii. Key corporate assets (intellectual property, core competencies, and financial and human assets) are managed through digital means.
iv. Internal and external environments are quickly recognized and dealt with.
Growth of online software as a service
Software as a Service (SaaS): it is clear that software will increasingly be delivered and used over networks as a service, delivered over the Internet. In addition to free or low-cost tools for individuals and small businesses provided by Google or Yahoo!, enterprise software and other complex business functions are available as services from the major commercial software vendors. Instead of buying and installing software programs, subscribing companies rent the same functions from these services, with users paying on either a subscription or per-transaction basis. Services for delivering and providing remote access to software as a Web-based service are now referred to as Software as a Service (SaaS). A leading example is
Salesforce.com, which provides on-demand software services for customer relationship
management, including sales force automation, partner relationship management, marketing, and
customer service. It includes tools for customization, integrating its software with other corporate
applications, and integrating new applications to run other parts of the business. The Window on
Organizations provides more detail on these capabilities. In some cases, the cost of renting
software will add up to more than purchasing and maintaining the application in-house. Yet there
may be benefits to paying more for software as a service if this decision allows the company to
focus on core business issues instead of technology challenges.
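The rent-versus-buy trade-off described above can be sketched numerically. The following is an illustrative comparison only; the fee, license, and maintenance figures are made-up assumptions, not real vendor prices.

```python
# Hypothetical cost comparison: renting software (SaaS) vs. buying and
# maintaining it in-house. All figures are illustrative assumptions.

def cumulative_saas_cost(monthly_fee_per_user, users, months):
    """Total subscription cost over a given number of months."""
    return monthly_fee_per_user * users * months

def cumulative_inhouse_cost(license_cost, monthly_maintenance, months):
    """Upfront license plus ongoing maintenance over the same period."""
    return license_cost + monthly_maintenance * months

# Example: 50 users at $30/user/month vs. a $60,000 license with $500/month upkeep.
for months in (12, 36, 60):
    saas = cumulative_saas_cost(30, 50, months)
    inhouse = cumulative_inhouse_cost(60_000, 500, months)
    print(f"{months} months: SaaS ${saas:,} vs in-house ${inhouse:,}")
```

With these assumed figures, renting is cheaper early on and the two options break even at around five years, which is exactly why the decision often turns on whether the firm values focusing on core business over owning the technology.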

Growth of cloud computing


Cloud computing and storage solutions provide users and enterprises with various capabilities to
store and process their data in third-party data centers. It relies on sharing of resources to achieve
coherence and economies of scale, similar to a utility (like the electricity grid) over a network. At
the foundation of cloud computing is the broader concept of converged infrastructure and shared
services.
Proponents claim that cloud computing allows companies to avoid upfront infrastructure costs and focus on projects that differentiate their businesses instead of on infrastructure. Proponents also claim that cloud computing allows enterprises to get their applications up and running faster, with improved manageability and less maintenance, and enables IT to more rapidly adjust resources to meet fluctuating and unpredictable business demand. Cloud providers typically use a "pay as you go" model. This can lead to unexpectedly high charges if administrators do not adapt to the cloud pricing model.
The present availability of high-capacity networks, low-cost computers and storage devices as
well as the widespread adoption of hardware virtualization, service-oriented architecture, and
autonomic and utility computing have led to a growth in cloud computing. Companies can scale
up as computing needs increase and then scale down again as demands decrease.
Cloud vendors are experiencing growth rates of 50% per annum.
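The "pay as you go" and elastic-scaling points above can be illustrated with a toy billing calculation. The hourly rate and usage pattern below are made-up assumptions for illustration, not real cloud prices.

```python
# Illustrative "pay as you go" billing: cost follows actual usage, while
# fixed provisioning must cover peak demand around the clock.

def pay_as_you_go_cost(hourly_usage, rate_per_instance_hour):
    """Bill only the instance-hours actually consumed."""
    return sum(hourly_usage) * rate_per_instance_hour

def fixed_capacity_cost(hourly_usage, rate_per_instance_hour):
    """Provision enough capacity for the peak, 24 hours a day."""
    peak = max(hourly_usage)
    return peak * len(hourly_usage) * rate_per_instance_hour

# A day where demand spikes to 10 instances for 4 hours, otherwise 2.
usage = [2] * 20 + [10] * 4
print(pay_as_you_go_cost(usage, 0.10), fixed_capacity_cost(usage, 0.10))
```

Because the cloud bill tracks the spiky usage curve rather than its peak, the pay-as-you-go total here is a third of the fixed-capacity cost; the same arithmetic run against steady near-peak usage would show the gap closing, which is the "unexpectedly high charges" caveat above.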

Mashups and Widgets


In the past, software such as Microsoft Word or Adobe Illustrator came in a box and was
designed to operate on a single machine. Increasingly, software is downloadable from the
Internet and composed of interchangeable components that integrate freely with other
applications on the Internet. Individual users and entire companies mix and match these software
components to create their own customized applications and to share information with others.
The resulting software applications are called mashups. The idea is to take software from different sources and combine it in order to produce an application that is greater than the sum of its parts. Part of the movement called Web 2.0, and in the spirit of musical mashups, Web mashups combine the capabilities of two or more online applications to create a kind of hybrid that provides more customer value than the original sources alone. One area of great innovation
is the mashup of mapping and satellite image software with local content. For instance, Zoocasa

is a new real estate search engine in Canada that is using Google Maps to display real estate listings.
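A mashup of this kind can be sketched in a few lines. In the sketch below both "services" are simulated with in-memory data so the example is self-contained; a real mashup such as Zoocasa's would call a listings API and a mapping API over the Web. The addresses, prices, and coordinates are invented.

```python
# Minimal mashup sketch: combine real-estate listings from one "service"
# with coordinates from a mapping "service" into map-ready records.

listings_service = [
    {"address": "12 Main St", "price": 450_000},
    {"address": "98 Oak Ave", "price": 610_000},
]

def geocode(address):
    """Stand-in for a mapping API that turns an address into coordinates."""
    fake_coords = {"12 Main St": (43.65, -79.38), "98 Oak Ave": (43.70, -79.42)}
    return fake_coords.get(address)

def mashup(listings):
    """Merge the two sources: each listing gains a coordinate for plotting."""
    return [{**listing, "coords": geocode(listing["address"])}
            for listing in listings]

for record in mashup(listings_service):
    print(record["address"], record["price"], record["coords"])
```

The combined records carry information neither source holds alone, which is the "greater than the sum of its parts" quality described above.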
Smart virtual personal assistants: SVPAs started entering the market in 2013. At the time, they
used semantic and natural language processing; data mined from our calendars, email, and
contact lists; and the last few minutes of our behavior to anticipate the next 10 seconds of our
thinking. Most of those original apps have now been acquired: Emu was acquired by Google, Donna by Yahoo, Cue by Apple, and the list goes on. When it was
still active, Emu was a clever stand-in for a personal secretary. It would monitor the conversation
and automatically make suggestions as two people texted. For example, if you asked your friend
to see a movie, Emu would immediately geolocate both of you, suggest a nearby theater and
show films and times, then check your calendars for your availability. It would even display a
preview for you to watch. Once it determined the best time for you to meet, it would help you
purchase tickets and enter all the data into your calendar. And it did all of this inside a single
mobile application. In 2015, consumers will begin to see SVPA technology baked into their
mobile phones. For example, Google is quietly starting to release a new SVPA function for Android users: it automatically detects when you've parked your car, marks your parking spot for you on a Google map, and helps get you back to it once you're ready to start driving again.
All without you explicitly asking it to do so. Marketers, credit card companies, banks, local
government agencies, political campaigns, and many others can harness SVPAs to both deliver
critical information and to better read and understand constituents.

Screenless Display (3D glasses)


This field saw rapid progress in 2013 and appears set for imminent breakthroughs in the scalable deployment of screenless display. Various companies have made significant breakthroughs in the field, including virtual reality headsets, bionic contact lenses, the development of mobile phones for the elderly and partially blind, and hologram-like videos without the need for moving parts or glasses.

Q3 Analyze the key ethical and social issues relating to information systems
There are many unique challenges we face in this age of information. They stem from the nature
of information itself. Information is the means through which the mind expands and increases its capacity to achieve its goals, often as the result of an input from another mind. Thus,
information forms the intellectual capital from which human beings craft their lives and secure
dignity.
However, the building of intellectual capital is vulnerable in many ways. For example, people's
intellectual capital is impaired whenever they lose their personal information without being
compensated for it, when they are precluded access to information which is of value to them,
when they have revealed information they hold intimate, or when they find out that the
information upon which their living depends is in error. The social contract among people in the
information age must deal with these threats to human dignity. The ethical issues involved are many and varied; however, it is helpful to focus on just four. These may be summarized by means of an acronym: PAPA.
Privacy: What information about one's self or one's associations must a person reveal to
others, under what conditions and with what safeguards? What things can people keep to
themselves and not be forced to reveal to others?
Accuracy: Who is responsible for the authenticity, fidelity and accuracy of information?
Similarly, who is to be held accountable for errors in information and how is the injured party
to be made whole?
Property: Who owns information? What are the just and fair prices for its exchange? Who
owns the channels, especially the airways, through which information is transmitted? How
should access to this scarce resource be allocated?
Accessibility: What information does a person or an organization have a right or a privilege
to obtain, under what conditions and with what safeguards?
1. Privacy
What information should one be required to divulge about one's self to others? Under what
conditions? What information should one be able to keep strictly to one's self? These are among
the questions that a concern for privacy raises. Today more than ever cautious citizens must be
asking these questions.

Two forces threaten our privacy. One is the growth of information technology, with its enhanced
capacity for surveillance, communication, computation, storage, and retrieval. A second, and
more insidious threat, is the increased value of information in decision-making. Information is
increasingly valuable to policy makers; they covet it even if acquiring it invades another's
privacy.

2. Accuracy
Misinformation has a way of fouling up people's lives, especially when the party with the
inaccurate information has an advantage in power and authority. So it is our responsibility to be
vigilant in the pursuit of accuracy in information. Today we are producing so much information
about so many people and their activities that our exposure to problems of inaccuracy is
enormous. And this growth in information also raises another issue: Who owns it?
A special burden is placed on the accuracy of information when people rely on it for matters of
life and death, as we increasingly do. This came to light in a recent $3.2 million lawsuit charging the National Weather Service with failing to accurately predict a storm that raged on the southeast slope of Georges Bank in 1980. The source of the fatal error was the failure of a large-scale information system which collects data from high-atmosphere balloons, satellites, ships, and a
series of buoys. This data is then transmitted to a National Oceanic and Atmospheric Administration computer which analyzes it and produces forecasts. The forecasts, in turn, are
broadcast widely.

The affected families received a large financial settlement, but can they ever be repaid for the irreparable harm done to them and to their dignity? They received a judgment in their case, but have they been repaid for the loss of their loved ones? The point is this: we run the risk of creating chaos every time we design information systems and place information in databases which might be used to make decisions.
3. Property
One of the most complex issues we face as a society is the question of intellectual property
rights. There are substantial economic and ethical concerns surrounding these rights; concerns
revolving around the special attributes of information itself and the means by which it is
transmitted. Any individual item of information can be extremely costly to produce in the first instance. Yet, once it is produced, that information has the elusive quality of being easy to reproduce and to share with others. Moreover, this replication can take place without destroying the original. This makes information hard to safeguard since, unlike tangible property, it is easily communicated and hard to keep to one's self. It is even difficult to secure appropriate reimbursement when somebody else uses your information.
We currently have several imperfect institutions that try to protect intellectual property rights. Copyrights, patents, encryption, oaths of confidentiality, and such old-fashioned values as trustworthiness and loyalty are the most commonly used protectors of our intellectual property.
Problem issues, however, still abound in this area. Let us focus on just one aspect: artificial
intelligence and its expanding subfield, expert systems.

4. Access
Our main avenue to information is through literacy. Literacy, since about 1500 B.C., when the Syrians first conceived a consonant alphabet, has been a requirement for full participation in the
fabric of society. Each innovation in information handling, from the invention of paper to the
modern computer, has placed new demands on achieving literacy. In an information society a
citizen must possess at least three things to be literate:
- One must have the intellectual skills to deal with information. These are skills such as reading, writing, reasoning, and calculating. This is a task for education.
- One must have access to the information technologies which store, convey and process
information. This includes libraries, radios, televisions, telephones, and increasingly, personal
computers or terminals linked via networks to mainframes. This is a problem in social
economics.
- Finally, one must have access to the information itself. This requirement returns to the issue of
property and is also a problem in social economics.
These requirements for literacy are a function of both the knowledge level and the economic
level of the individual. Unfortunately, for many people in the world today both of these levels are
currently deteriorating.
Reflect for a moment on the social effects of electronically stored databases. Prior to their invention, vast quantities of data about publications, news events, economic and social statistics, and scientific findings were available in printed, microfilm, or microfiche form at a relatively low cost. For most of us, access to this data was substantially free. We merely went
to our public or school library. The library, in turn, paid a few hundred dollars for the service and
made it available to whomever asked for it. Today, however, much of this information is being
converted to computerized databases and the cost to access these databases can run in the
thousands of dollars.

Q4
How is cloud computing applicable to Enterprise Architecture & IT Infrastructure
Systems Analysis
So what's the problem? Answering that question is harder than you might think. You have to
analyze the current situation to determine the real cause of the problem. Make sure you're
addressing the real problem and not just the symptoms. Effective systems analysis, adequately
determining the real problem, is the key.
Write down everything you do in this stage, especially when it comes to what the real problem or
opportunity is. Constantly review it throughout the rest of the system development process to
remind you and others of what you're trying to do and where you're trying to go. It's natural to
stray from the path! Most of all, determine how your objective fits in with the rest of the current
information systems and the business plan itself.
Is your idea even feasible? You might be surprised how often organizations fail to ask this
question. A feasibility study helps you determine if your proposed solution is achievable before
you spend thousands of dollars. The study will review the technical, financial, and organizational
feasibility of hardware, software, and persware and help you decide whether your proposed
answer is the right one. Too often organizations underestimate the cost of a new system,
especially in the persware area: training, downtime, lost productivity, and employee disruption.

Establishing Information Requirements


Figuring out who needs what information, where, when, and how will challenge the political
dynamics of any organization. No system can answer every need, so you're going to have to
decide who gets what. That's why you must write down the problem and then keep referring to
your notes throughout the development process. It is too easy to get sidetracked by politics.
You must think and then rethink the proposed solution. Make sure you thoroughly investigate the information requirements: you're going to live or die by the outcome. Whatever happens at this stage will carry through to all the other stages. A significant cause of system failure in development projects is that organizations fail to properly analyze and understand their information requirements.
The final dilemma is whether a new information system is really the answer. Would it be better to
address the problem through management changes, more training, or changing existing
organizational processes?
Systems Design
Congratulations! If you get to the systems design stage, it means you managed to live through
the analysis phase. Now you can get down to figuring out how the system will actually solve the
problem or help you take advantage of new opportunities. Remember, your goal is to fit the
system into the organization and not make the organization fit the new system. Or at least you

want to keep them in tandem; that is, the organization should decide what technology is
necessary, while the system capabilities can help reshape the organization.
When we discussed database management systems, we distinguished between two methods of
viewing data: the physical design (how the data would actually be stored in the system) and the
logical design (how the data would look to the user). Use the same definitions when you are
designing your system, and concentrate on the logical design. In addition to elements that the
authors point out in the text, the physical design should determine how the new system will
support the current organizational structure, or spell out the changes in that structure that will
successfully integrate the new system.
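The logical/physical distinction above can be made concrete with a tiny sketch. The record layout and field names below are invented purely for illustration: the physical design packs a record into a fixed-width byte layout, while the logical design presents the same data to the user as named fields.

```python
# Sketch of the logical vs. physical design distinction from database
# systems: users see named fields; the storage layer sees bytes.

import struct

def to_physical(customer_id, balance):
    """Physical design: pack the record into a fixed 12-byte layout
    (4-byte unsigned id + 8-byte unsigned balance, big-endian)."""
    return struct.pack(">IQ", customer_id, balance)

def to_logical(raw):
    """Logical design: present the stored bytes as named fields."""
    customer_id, balance = struct.unpack(">IQ", raw)
    return {"customer_id": customer_id, "balance": balance}

raw = to_physical(42, 125_000)
print(len(raw), to_logical(raw))  # bytes on disk vs. the user's view
```

The point of concentrating on the logical design is that `to_logical`'s view can stay stable even if the physical layout is later changed, which is exactly the separation the text recommends carrying into systems design.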
The Role of End Users
Unfortunately, the physical design sometimes overrides the logical design. Why? Because the non-techies give up too much control to the techies. This is a reminder that both sides have to work together, keeping the goals of the organization as the number one priority, and remembering that the best system is one that meets the users' needs.
Don't forget that people are the most important component of any system. As soon as users begin
to feel they have little input into the development process, you are courting disaster. Keeping the
end user involved will produce a better system. The number one reason so many system development projects fail is insufficient user involvement.

Q5
With examples explain the concept of Business Intelligence
Business intelligence (BI) is the set of techniques and tools for the transformation of raw data
into meaningful and useful information for business analysis purposes. BI technologies are
capable of handling large amounts of unstructured data to help identify, develop and otherwise
create new strategic business opportunities. The goal of BI is to allow for the easy interpretation of these large volumes of data. Identifying new opportunities and implementing an effective strategy based on insights can provide businesses with a competitive market advantage and long-term stability.
BI technologies provide historical, current, and predictive views of business operations. Common functions of business intelligence technologies are reporting, online analytical processing, analytics, data mining, process mining, complex event processing, business performance management, benchmarking, text mining, predictive analytics, and prescriptive analytics.
BI can be used to support a wide range of business decisions ranging from operational to
strategic. Basic operating decisions include product positioning or pricing. Strategic business
decisions include priorities, goals and directions at the broadest level. In all cases, BI is most
effective when it combines data derived from the market in which a company operates (external
data) with data from company sources internal to the business such as financial and operations
data (internal data). When combined, external and internal data can provide a more complete
picture which, in effect, creates an "intelligence" that cannot be derived by any singular set of
data.
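The blending of internal and external data described above can be sketched with a toy calculation. The figures below are invented: company sales (internal, from finance systems) combined with market-size estimates (external, from market research) yield market share, a metric neither source provides alone.

```python
# Sketch: combine internal and external data into an "intelligence"
# view (market share) that no single source can supply. All numbers
# are made up for illustration.

internal_sales = {"2022": 1.2, "2023": 1.5}      # $M, internal data
external_market = {"2022": 24.0, "2023": 25.0}   # $M, external data

def market_share(sales, market):
    """Market share (%) per year, from the two blended sources."""
    return {year: round(100 * sales[year] / market[year], 1)
            for year in sales if year in market}

print(market_share(internal_sales, external_market))
```

Here the blended view shows share growing from 5% to 6%, an insight invisible to anyone looking at the sales figures or the market figures in isolation.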
The term "business intelligence" was originally coined by Richard Millar Devens in the Cyclopædia of Commercial and Business Anecdotes of 1865. Devens used the term to describe how the banker Sir Henry Furnese gained profit by receiving and acting upon information about his environment prior to his competitors: "Throughout Holland, Flanders, France, and Germany, he maintained a complete and perfect train of business intelligence. The news of the many battles fought was thus received first by him, and the fall of Namur added to his profits, owing to his early receipt of the news" (Devens, 1865). The ability to collect and

react accordingly based on the information retrieved, an ability at which Furnese excelled, is today still at the very heart of BI.

Applications in an enterprise
Business intelligence can be applied to the following business purposes in order to drive business value:
1. Measurement: a program that creates a hierarchy of performance metrics (see also Metrics Reference Model) and benchmarking that informs business leaders about progress towards business goals (business process management).
2. Analytics: a program that builds quantitative processes for a business to arrive at optimal decisions and to perform business knowledge discovery. Frequently involves data mining, process mining, statistical analysis, predictive analytics, predictive modeling, business process modeling, data lineage, complex event processing, and prescriptive analytics.
3. Reporting/enterprise reporting: a program that builds infrastructure for strategic reporting to serve the strategic management of a business, as opposed to operational reporting. Frequently involves data visualization, executive information systems, and OLAP.
4. Collaboration/collaboration platform: a program that gets different areas (both inside and outside the business) to work together through data sharing and electronic data interchange.
5. Knowledge management: a program to make the company data-driven through strategies and practices to identify, create, represent, distribute, and enable adoption of insights and experiences that are true business knowledge. Knowledge management leads to learning management and regulatory compliance.
In addition to the above, business intelligence can provide a proactive approach, such as alert functionality that immediately notifies the end user if certain conditions are met. For example, if some business metric exceeds a pre-defined threshold, the metric will be highlighted in standard reports, and the business analyst may be alerted via e-mail or another monitoring service. This end-to-end process requires data governance, which should be handled by experts.
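The alert functionality just described can be sketched in a few lines. The metric names and threshold values below are hypothetical examples, not part of any real BI product.

```python
# Sketch of BI alert functionality: flag any metric that crosses its
# pre-defined threshold so an analyst can be notified.

thresholds = {"returns_rate": 0.05, "stockouts": 10}

def check_alerts(metrics, thresholds):
    """Return only the metrics that exceed their threshold."""
    return {name: value for name, value in metrics.items()
            if name in thresholds and value > thresholds[name]}

todays_metrics = {"returns_rate": 0.08, "stockouts": 3, "revenue": 120_000}
alerts = check_alerts(todays_metrics, thresholds)
print(alerts)  # only the breached metric is surfaced
```

In a real deployment the returned dictionary would feed the highlighting in standard reports or an e-mail notification step; the core logic is just this threshold comparison.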

According to Kimball et al., there are three critical areas that organizations should assess before undertaking a BI project:
1. The level of commitment and sponsorship of the project from senior management
2. The level of business need for creating a BI implementation
3. The amount and quality of business data available.

Q6
Infrastructure as a service (IaaS)
In the most basic cloud-service model, and according to the IETF (Internet Engineering Task Force), providers of IaaS offer computers (physical or, more often, virtual machines) and other resources. (A hypervisor, such as Xen, Oracle VirtualBox, KVM, VMware ESX/ESXi, or Hyper-V, runs the virtual machines as guests. Pools of hypervisors within the cloud operational support system can support large numbers of virtual machines and the ability to scale services up and down according to customers' varying requirements.) IaaS clouds often offer additional resources such as a virtual-machine disk image library, raw block storage, file or object storage, firewalls, load balancers, IP addresses, virtual local area networks (VLANs), and software bundles. IaaS-cloud providers supply these resources on demand from their large pools installed in data centers. For wide-area connectivity, customers can use either the Internet or carrier clouds (dedicated virtual private networks).
To deploy their applications, cloud users install operating-system images and their application
software on the cloud infrastructure. In this model, the cloud user patches and maintains the
operating systems and the application software. Cloud providers typically bill IaaS services on a
utility computing basis: cost reflects the amount of resources allocated and consumed.
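Utility-computing billing, where cost reflects the resources allocated and consumed, can be sketched as a simple metered sum. The resource names and per-unit rates below are invented assumptions, not any real provider's price list.

```python
# Sketch of IaaS utility billing: each metered resource is charged
# per unit consumed, and the bill is the sum across resources.

rates = {"vcpu_hours": 0.04, "gb_ram_hours": 0.005, "gb_storage_days": 0.002}

def iaas_bill(usage, rates):
    """Sum the cost over every metered resource the customer consumed."""
    return sum(usage[resource] * rates[resource]
               for resource in usage if resource in rates)

# Two always-on VMs for a 30-day month: 1440 vCPU-hours, 5760 GB-RAM-hours,
# plus 100 GB of storage held for 30 days (3000 GB-days).
monthly_usage = {"vcpu_hours": 1440, "gb_ram_hours": 5760, "gb_storage_days": 3000}
print(f"${iaas_bill(monthly_usage, rates):.2f}")
```

Scaling down simply shrinks the usage figures and therefore the bill, which is the utility-grid analogy in practice.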
Platform as a service (PaaS)
In the PaaS model, cloud providers deliver a computing platform, typically including an operating system, programming-language execution environment, database, and web server. Application developers can develop and run their software solutions on a cloud platform without the cost and complexity of buying and managing the underlying hardware and software layers. With some PaaS offerings, such as Microsoft Azure and Google App Engine, the underlying computer and storage resources scale automatically to match application demand so that the cloud user does not have to allocate resources manually. Automatic scaling has also been proposed in architectures aiming to support real-time processing in cloud environments. Even more specific application types can be provided via PaaS, e.g., media encoding as provided by services such as the bitcodin transcoding cloud or media.io.
Software as a service (SaaS)
In the business model using software as a service (SaaS), users are provided access to application
software and databases. Cloud providers manage the infrastructure and platforms that run the
applications. SaaS is sometimes referred to as "on-demand software" and is usually priced on a
pay-per-use basis or using a subscription fee.
In the SaaS model, cloud providers install and operate application software in the cloud and
cloud users access the software from cloud clients. Cloud users do not manage the cloud
infrastructure and platform where the application runs. This eliminates the need to install and run
the application on the cloud user's own computers, which simplifies maintenance and support.
Cloud applications differ from other applications in their scalability, which can be achieved by cloning tasks onto multiple virtual machines at run time to meet changing work demand. Load balancers distribute the work over the set of virtual machines. This process is
transparent to the cloud user, who sees only a single access point. To accommodate a large
number of cloud users, cloud applications can be multitenant, that is, any machine serves more
than one cloud user organization.
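Multitenancy, where one machine serves more than one cloud user organization, can be sketched as tenant-scoped data access. The tenant names and records below are purely illustrative; real SaaS systems enforce the same isolation at the database or application layer.

```python
# Multitenancy sketch: one application instance holds data for several
# customer organizations, and every query is scoped to one tenant.

records = [
    {"tenant": "acme", "invoice": 101},
    {"tenant": "globex", "invoice": 102},
    {"tenant": "acme", "invoice": 103},
]

def for_tenant(records, tenant):
    """Return only the requesting tenant's rows; other tenants' data
    sharing the same storage is never exposed."""
    return [r for r in records if r["tenant"] == tenant]

print(for_tenant(records, "acme"))  # acme never sees globex's invoices
```

The same filtering discipline, applied to every read and write, is what lets a single instance serve many organizations while keeping each one's data isolated.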
The pricing model for SaaS applications is typically a monthly or yearly flat fee per user, so
price is scalable and adjustable if users are added or removed at any point.
Proponents claim SaaS allows a business the potential to reduce IT operational costs by
outsourcing hardware and software maintenance and support to the cloud provider. This enables
the business to reallocate IT operations costs away from hardware/software spending and
personnel expenses, towards meeting other goals. In addition, with applications hosted centrally,
updates can be released without the need for users to install new software. One drawback of
SaaS is that the users' data are stored on the cloud provider's server. As a result, there could be
unauthorized access to the data. For this reason, users are increasingly adopting intelligent third-party key management systems to help secure their data.

Cloud clients
Users access cloud computing through networked client devices, such as desktop computers, laptops, tablets, and smartphones, as well as any Ethernet-enabled device such as home-automation gadgets. Some of these devices (cloud clients) rely on cloud computing for all or a majority of their applications, so they are essentially useless without it. Examples are thin
clients and the browser-based Chromebook. Many cloud applications do not require specific
software on the client and instead use a web browser to interact with the cloud application.
With Ajax and HTML5, these web user interfaces can achieve a look and feel similar to, or even better than, that of native applications. Some cloud applications, however, support specific client software
dedicated to these applications (e.g., virtual desktop clients and most email clients). Some legacy
applications (line of business applications that until now have been prevalent in thin client
computing) are delivered via a screen-sharing technology.

Private cloud
Private cloud is cloud infrastructure operated solely for a single organization, whether managed
internally or by a third party, and hosted either internally or externally. Undertaking a private cloud project requires a significant degree of engagement to virtualize the business environment, and requires the organization to reevaluate decisions about existing resources.
When done right, it can improve business, but every step in the project raises security issues that
must be addressed to prevent serious vulnerabilities. Self-run data centers are generally capital
intensive. They have a significant physical footprint, requiring allocations of space, hardware,
and environmental controls. These assets have to be refreshed periodically, resulting in additional
capital expenditures. They have attracted criticism because users "still have to buy, build, and
manage them" and thus do not benefit from less hands-on management, essentially "[lacking] the
economic model that makes cloud computing such an intriguing concept".
Public cloud
A cloud is called a "public cloud" when the services are rendered over a network that is open for
public use. Public cloud services may be free. Technically there may be little or no difference between public and private cloud architecture; however, security considerations may be substantially different for services (applications, storage, and other resources) that are made available by a service provider to a public audience and when communication takes place over a non-trusted network. Generally, public cloud service providers such as Amazon AWS, Microsoft, and Google own and operate the infrastructure at their data centers, and access is generally via the Internet. AWS and Microsoft also offer direct-connect services, called "AWS Direct Connect" and "Azure ExpressRoute" respectively; such connections require customers to purchase or lease a private connection to a peering point offered by the cloud provider.
Hybrid cloud
Hybrid cloud is a composition of two or more clouds (private, community or public) that remain
distinct entities but are bound together, offering the benefits of multiple deployment models.
Hybrid cloud can also mean the ability to connect colocation, managed, and/or dedicated services with cloud resources.
Gartner, Inc. defines a hybrid cloud service as a cloud computing service that is composed of
some combination of private, public and community cloud services, from different service
providers. A hybrid cloud service crosses isolation and provider boundaries so that it can't be
simply put in one category of private, public, or community cloud service. It allows one to
extend either the capacity or the capability of a cloud service, by aggregation, integration or
customization with another cloud service.
Varied use cases for hybrid cloud composition exist. For example, an organization may store
sensitive client data in house on a private cloud application, but interconnect that application to a
business intelligence application provided on a public cloud as a software service. This example
of hybrid cloud extends the capabilities of the enterprise to deliver a specific business service
through the addition of externally available public cloud services. Hybrid cloud adoption
depends on a number of factors such as data security and compliance requirements, level of
control needed over data, and the applications an organization uses.
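The data-placement decision in the example above can be sketched as a simple policy function. The tag names, backend labels, and record shape below are all hypothetical, chosen only to illustrate keeping compliance-sensitive data in house:

```python
def choose_deployment(record, compliance_tags=("pii", "phi", "financial")):
    """Hypothetical placement policy: records carrying compliance-sensitive
    tags stay on the private cloud; everything else may flow to the public
    BI service. Tag names are illustrative, not a standard."""
    if any(tag in record.get("tags", ()) for tag in compliance_tags):
        return "private-cloud"
    return "public-bi-service"

placements = [choose_deployment(r) for r in [
    {"id": 1, "tags": ["pii"]},                # sensitive client data
    {"id": 2, "tags": ["aggregate-metrics"]},  # safe to analyze publicly
]]
```

In a real deployment this decision would reflect the organization's compliance requirements rather than a fixed tag list.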
Another example of hybrid cloud is one where IT organizations use public cloud computing
resources to meet temporary capacity needs that cannot be met by the private cloud. This
capability enables hybrid clouds to employ cloud bursting for scaling across clouds. Cloud
bursting is an application deployment model in which an application runs in a private cloud or
data center and "bursts" to a public cloud when the demand for computing capacity increases. A
primary advantage of cloud bursting and a hybrid cloud model is that an organization only pays
for extra compute resources when they are needed. Cloud bursting enables data centers to create
an in-house IT infrastructure that supports average workloads, and use cloud resources from
public or private clouds, during spikes in processing demands.
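The bursting rule described above reduces to a simple dispatch decision; the capacity figure below is invented for illustration:

```python
PRIVATE_CAPACITY = 100  # request units the in-house data center can absorb (illustrative)

def dispatch(load):
    """Cloud-bursting sketch: serve everything from the private cloud until
    its capacity is exhausted, then 'burst' only the overflow to a public
    cloud, so extra compute is paid for only when it is actually needed."""
    private = min(load, PRIVATE_CAPACITY)
    public = max(0, load - PRIVATE_CAPACITY)
    return {"private": private, "public_burst": public}

normal = dispatch(80)    # average workload: handled entirely in house
spike = dispatch(150)    # demand spike: 50 units burst to the public cloud
```

Sizing the in-house infrastructure for the average workload, as the text notes, is exactly what makes the `public_burst` term zero most of the time.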
The specialized model of hybrid cloud, which is built atop heterogeneous hardware, is called "Cross-platform Hybrid Cloud". A cross-platform hybrid cloud is usually powered by different CPU architectures underneath, for example x86-64 and ARM. Users can transparently deploy applications without knowledge of the cloud's hardware diversity. This kind of cloud emerges from the rise of ARM-based systems-on-chip for server-class computing.
Community cloud
Community cloud shares infrastructure between several organizations from a specific community
with common concerns (security, compliance, jurisdiction, etc.), whether managed internally or
by a third-party, and either hosted internally or externally. The costs are spread over fewer users
than a public cloud (but more than a private cloud), so only some of the cost savings potential of
cloud computing are realized.
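The cost-spreading point is simple division; the figures below are invented solely to show why a community cloud's per-tenant cost sits between the private and public extremes:

```python
def cost_per_tenant(shared_infra_cost, tenants):
    """The same infrastructure bill divided among more tenants yields a
    lower per-tenant cost. All amounts here are illustrative."""
    return shared_infra_cost / tenants

private_cost = cost_per_tenant(120_000, 1)     # one organization bears it all
community_cost = cost_per_tenant(120_000, 4)   # a few like-minded organizations
public_cost = cost_per_tenant(120_000, 400)    # many unrelated customers
```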
Distributed cloud
A cloud computing platform can be assembled from a distributed set of machines in different
locations, connected to a single network or hub service. It is possible to distinguish between two
types of distributed clouds: public-resource computing and volunteer cloud.

Public-resource computing: This type of distributed cloud results from an expansive definition of cloud computing, because such systems are more akin to distributed computing than to cloud computing. Nonetheless, it is considered a sub-class of cloud computing; examples include distributed computing platforms such as BOINC and Folding@Home.

Volunteer cloud: Volunteer cloud computing is characterized as the intersection of public-resource computing and cloud computing, where a cloud computing infrastructure is built using volunteered resources. Many challenges arise from this type of infrastructure because of the volatility of the resources used to build it and the dynamic environment it operates in. Such systems are also called peer-to-peer clouds or ad hoc clouds. An interesting effort in this direction is Cloud@Home, which aims to implement a cloud computing infrastructure using volunteered resources, providing a business model to incentivize contributions through financial restitution.

Intercloud
The Intercloud is an interconnected global "cloud of clouds" and an extension of the Internet
"network of networks" on which it is based. The focus is on direct interoperability between
public cloud service providers, more so than between providers and consumers (as is the case for
hybrid- and multi-cloud).
