
Business Intelligence Dictionary Contents

Ad hoc report
Advanced analytics
Base table
Common dimensions
Compound key or composite key
Consolidation (Data)
Dashboard
Data
Data cleansing - data cleaning - data scrubbing
DI - Data Integration
Data Lineage
Data mart
Data mining
Data Vault
Data warehouse
DB2 Connect
Denormalize
Dimension
Dimension table
EDW - Enterprise Data Warehouse
ETL - Extraction, Transformation & Loading
Fact table
Financial Disclosure Management
Foreign key
Granularity
IBM Cognos TM1 Connector for SAP BW
IBM Information Analyzer
IBM Infosphere Business Glossary
IBM InfoSphere DataStage
IBM InfoSphere Metadata Workbench
In-memory
KPI or Key Performance Indicator
Master data
Measure or metric
Metadata
Mobile BI
Natural key or Business key
Normalize
ODS
OLAP
Primary key
Query
Report bursting
Repository
SaaS
SAP Analysis, Edition for MS Office
SAP Analysis, Edition for OLAP
SAP BEX - SAP Business Explorer
SAP Business Objects BI launch pad
SAP Business Objects Dashboards
SAP Business Objects Report Comparison Tool
SAP Business Objects Report Conversion Tool
SAP Crystal Reports
SAP HANA - SAP High Performance Analytical Appliance
Slice and dice
Slowly changing dimension
SMART
Snowflake schemas
Star schemas
Structured data
Surrogate key or technical key
Unstructured data

Ad hoc report
An ad hoc report is a report created on the fly by sending a query to the database, rather than a report that was intentionally created and stored as a repeatable, predefined operational report. Such reports generally cover a specific need for information that was not foreseen, are mostly used just once and are thus created "ad hoc" by the BI user.

Advanced analytics
Advanced analytics can be defined as analysis that makes use of mathematical and statistical algorithms. Data mining and BI reporting tools can be used to support advanced analytics. Advanced analytics can be classified into three groups: descriptive analytics (grouping and segmentation), predictive analytics (patterns and anomalies) and optimization analytics (outcome strategies).

Base table
A base table is a synonym for a fact table. It contains data stored at the lowest level of detail. In this way the information can be combined with a multitude of dimensional information.

Common dimensions
When a data model is defined within a data warehouse context, it is important to properly define the dimensions. In this way the same dimension can easily be re-used within another context. Common dimensions are standard dimensions, which can be shared among various data models or star schemas. They are dimensions which mean the same thing within every possible fact table to which they can be joined. Typically these are dimensions (master data) that appear in different operational sources (ERP, CRM, ...) and document essential information elements like customers, products, employees, legal entities, etc.

Compound key or composite key


A compound key uses multiple columns within a table in order to uniquely identify a single record.
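As a minimal illustration, the SQLite sketch below (table and column names are made up for the example) defines a compound key on an order-lines table, where no single column identifies a row but the pair of columns does.

```python
import sqlite3

conn = sqlite3.connect(":memory:")

# Hypothetical order-lines table: neither order_id nor line_number alone is unique,
# but the pair (order_id, line_number) uniquely identifies each record.
conn.execute("""
    CREATE TABLE order_lines (
        order_id    INTEGER NOT NULL,
        line_number INTEGER NOT NULL,
        product_id  INTEGER NOT NULL,
        quantity    INTEGER NOT NULL,
        PRIMARY KEY (order_id, line_number)
    )
""")

conn.execute("INSERT INTO order_lines VALUES (1001, 1, 55, 3)")
conn.execute("INSERT INTO order_lines VALUES (1001, 2, 73, 1)")

# A second row with the same (order_id, line_number) pair is rejected,
# because the compound key must uniquely identify a single record.
try:
    conn.execute("INSERT INTO order_lines VALUES (1001, 1, 99, 5)")
except sqlite3.IntegrityError as err:
    print("Rejected duplicate compound key:", err)
```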

Consolidation (Data)
(Data) Consolidation is the concept of storing data, which exists in different systems and in different formats, in a central place and in a uniform way, from where it can be accessed by multiple users and applications.

Dashboard
A dashboard is an easily consumable BI user interface which can represent anything that is interesting to monitor or measure (mostly tactical or operational) in an integrated and visual way. Mostly the data is visualized through interactive dials, sliders, check boxes, gauges, maps, traffic lights and other highly visual components. It records actual performance and tells us how we are doing. A dashboard is not to be confused with a scorecard.

Data
Data is a gathering of factual and descriptive information which, within the context of data warehousing or business intelligence, is specifically designed to allow easy analysis and reporting of business information in order to support faster business decision making. Data can consist of measurements, words, properties or descriptions of objects relevant to the organisation.

Data cleansing - data cleaning - data scrubbing


The process of cleaning up the data before loading it into a data warehouse by removing errors, incomplete information or inconsistencies between sources.
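As a minimal, illustrative sketch (the customer extract and its quality issues are invented for the example), the pandas snippet below harmonises country codes, removes duplicate rows and drops records with missing mandatory fields.

```python
import pandas as pd

# Illustrative customer extract with typical quality issues:
# inconsistent country codes, a duplicate record and a missing e-mail address.
raw = pd.DataFrame({
    "customer_id": [1, 2, 2, 3],
    "country":     ["BE", "be ", "BE", "Belgium"],
    "email":       ["a@x.com", "b@x.com", "b@x.com", None],
})

cleaned = (
    raw
    .assign(country=lambda df: df["country"].str.strip().str.upper()
                                            .replace({"BELGIUM": "BE"}))  # harmonise codes
    .drop_duplicates()              # remove rows that became exact duplicates
    .dropna(subset=["email"])       # drop incomplete records
)

print(cleaned)
```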

DI - Data Integration
Data integration is the collection of processes and tools, strategies and philosophies by which fragmented data assets are aligned to support business goals. From a technical point of view, data integration covers topics such as data federation, virtualization and service-oriented architecture, but in the context of Performance Management or Business Intelligence it is usually synonymous with ETL (Extract-Transform-Load) or ELT-related activities.

Data Lineage
Data lineage is used to track and manage the data. It answers questions such as:

Where does the data come from? How is the data changed? Where does the data go to?

As such, data lineage covers the complete data lifecycle, by showing the data flow from source to target, through its various intermediate states. Ideally data lineage includes both the back-end and the front-end part. For obvious reasons the back-end part is included, since this is where data is transformed by definition. On the other hand, within the front-end part data can also be logically and/or physically transformed. These transformations can take place in the front-end semantic layers, reports and analyses, or even between different front-end tools. As ETL and BI front-end tools integrate within single platforms, end-to-end data lineage becomes a possibility, which offers great benefits in case of changes to the data warehouse.

Data mart
Data marts are usually smaller than data warehouses, as they contain data about a specific subject area and are designed to serve a particular community of users from the enterprise. Ideally a data mart is derived from or based on a data warehouse architecture, in order to ensure data consistency throughout the organisation.

Data mining
Data mining, or the knowledge discovery in databases (KDD) process, can be described as the search for strategic information and the extraction of patterns from large datasets. In this process, users are "digging" into the data, making use of artificial intelligence and statistical functions. It helps enterprises and scientists to select the essential information out of massive quantities of data and create prediction models.

Data Vault
This is one method in which a database or data warehouse can be designed. Data sourced from different operational systems is stored in such a way that it can be completely traced in terms of history. The data vault architecture has three basic entities: hub, link and satellite. A hub consists of a unique list of business keys. These hubs can be joined with several satellites that contain more detailed information about the hub and its keys. All the different hubs can then be connected with the help of the links. A link stores the single unique association between the business keys of the connecting hubs.

Data warehouse
Data warehouses centrally store enterprise information, with a focus on supporting strategic business decisions. The data warehouse usually gets its information out of operational systems (like ERP or CRM systems) or databases used for daily business operations. The data is ordered and structured in a specific way to facilitate reporting and analysis. Due to its content and the long-term storage of historical data, a data warehouse can become a powerful database within the organization.

DB2 Connect
DB2 Connect is a combination of application programming interfaces (APIs) and network infrastructure components that allow one to make SQL requests from Windows, Linux, or UNIX machines against databases running on IBM System z (mainframe) and System i (AS/400). In other words, the data resides on the mainframe while the application code that uses the data is deployed to distributed systems like a Windows laptop or a server.

Denormalize
This is the process of converting normalized tables again into a de-normalized form. Here a table may contain redundant information. This is a common technique within data warehousing, where star schemas are used to optimize performance.

Dimension
A dimension contains all information for one single entity in the context of one or more related fact tables. Within a business question it deals with the "by" part. E.g. a customer dimension contains a customer number, name, address, contacts etc.

Dimension table
The attributes in a dimension table describe all the different specific properties or characteristics of this dimension. Most often, in a star schema, the dimension tables are de-normalized.

EDW - Enterprise Data Warehouse


This is a data warehouse which holds the business data of an entire enterprise. The corresponding definitions have been accepted by the entire organization. The enterprise data warehouse is normalized and is source driven.

ETL - Extraction, Transformation & Loading


In order to execute reporting or analysis quickly, it is often necessary to present the data in a separate, user-friendly data model which allows for good performance. An ETL solution is typically used to get the data into such a data model. ETL stands for Extraction, Transformation and Loading. This process extracts raw data from a source system, enriches it with certain logic and loads the transformed data into a new database environment.
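A minimal sketch of the three ETL steps is given below; the source table, column names and transformation logic are invented for the example, and SQLite stands in for both the operational source and the warehouse target.

```python
import sqlite3
import pandas as pd

# Stand-in for the operational source system, populated with sample orders.
source = sqlite3.connect(":memory:")
source.executescript("""
    CREATE TABLE orders (order_date TEXT, customer_id INTEGER, amount REAL);
    INSERT INTO orders VALUES ('2024-01-15', 1, 100.0), ('2024-01-20', 1, 50.0),
                              ('2024-02-03', 2, 75.0);
""")

# Extract: pull the raw data out of the source system.
orders = pd.read_sql_query("SELECT order_date, customer_id, amount FROM orders", source)

# Transform: enrich with business logic, here deriving the month and aggregating revenue.
orders["order_month"] = pd.to_datetime(orders["order_date"]).dt.to_period("M").astype(str)
monthly = orders.groupby(["order_month", "customer_id"], as_index=False)["amount"].sum()

# Load: write the transformed data into the target reporting model.
target = sqlite3.connect(":memory:")   # stand-in for the data warehouse database
monthly.to_sql("fact_monthly_sales", target, if_exists="replace", index=False)
```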

Fact table
A fact table contains all numeric information for one single business process in the context of multiple related dimensions, e.g. a sales fact contains the number of sold quantities, revenue, margin, etc. The granularity is ideally kept as low as possible in order to enable a variety of reports and analyses.

Financial Disclosure Management


Financial Disclosure Management is a new software category that sits between Enterprise/Corporate Performance Management (EPM/CPM) and GRC (Governance, Risk & Compliance) processes. It automates the process of producing all kinds of recurring internal and external reporting (annual report, quarterly report, internal management books, ...) with a focus on combining the data from the CPM solutions (like financial reporting & consolidation packages) with the written-out analysis and management comments on the figures, and ensures consistency and accuracy of the process across departments and users based on workflow and authorization levels. Solutions in this new market segment include IBM Cognos FSR, SAP Business Objects Disclosure Management, Oracle Hyperion Disclosure Management / Oracle Hyperion Financial Close Management and Tagetik Disclosure Management.

Foreign key
A foreign key is the connecting link between two tables which makes it possible to use the related information together. In a proper setup, the foreign key points to the primary key of the connected table and thus ensures the referential integrity of the data.
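The SQLite sketch below (illustrative tables and names) shows how a foreign key points to the primary key of the connected table and how the database rejects values that would break referential integrity.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")   # SQLite enforces foreign keys only when enabled

conn.execute("CREATE TABLE customers (customer_id INTEGER PRIMARY KEY, name TEXT)")
conn.execute("""
    CREATE TABLE orders (
        order_id    INTEGER PRIMARY KEY,
        customer_id INTEGER NOT NULL REFERENCES customers (customer_id),
        amount      REAL
    )
""")

conn.execute("INSERT INTO customers VALUES (1, 'Acme')")
conn.execute("INSERT INTO orders VALUES (10, 1, 99.50)")   # valid: customer 1 exists

# An order pointing to a non-existent customer violates referential integrity.
try:
    conn.execute("INSERT INTO orders VALUES (11, 2, 10.00)")
except sqlite3.IntegrityError as err:
    print("Rejected:", err)
```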

Granularity
Granularity is the level of detail at which the numerical data is stored in the fact tables within a data warehouse.

IBM Cognos TM1 Connector for SAP BW


The IBM Cognos TM1 Connector for SAP BW is a connector or interface that allows the exchange of data between IBM Cognos TM1 and SAP BW (Business Warehouse) through RFC (remote function calls). The IBM Cognos TM1 Connector for SAP BW enables the transfer of data from a SAP BW data warehouse to a TM1 cube, either via a BEx query or via a direct mapping between a SAP BW InfoCube and a TM1 cube. The connector can also be used to write data from TM1 to a transactional ODS table in SAP BW.

IBM Information Analyzer


IBM Information Analyzer helps you quickly and easily understand your data: its structures, formats, relationships and quality issues. Information Analyzer provides a built-in methodology to analyze large volumes of data, with prebuilt information queries that help you quickly understand how the data is constructed, pinpoint quality challenges, and plan integration processes with higher confidence. Additionally, Information Analyzer provides a rule-based engine to constantly analyze and monitor your information quality. IBM Information Analyzer provides:

- Low-level profiling capabilities to understand data at the level of columns, tables, keys, and cross-relationships among data assets.
- A flexible rule-based engine to analyze, monitor and pinpoint quality issues that may affect your business.
- Native parallel execution to support large quantities of data.

IBM Infosphere Business Glossary


IBM InfoSphere Business Glossary is a software solution providing an enterprise data dictionary. It enables the creation, classification and management of the organizational vocabulary and business definitions, to provide better alignment between business and IT users. IBM Business Glossary:

- Provides better business understanding of the information by adding the business and technical context to business terms and reports.
- Is accessible from any application, spreadsheet and/or report, allowing immediate context for the information reviewed.
- Promotes better communication between business users and IT by providing greater awareness of the information sources, representation and stewardship.

IBM InfoSphere DataStage


IBM InfoSphere DataStage is a leading Extraction-Transformation-Loading (ETL) tool, offering parallel processing and integration for high volumes of data sources and target applications.


DataStage includes a large library of built-in transformations and an internal programming language (BASIC) to support simple to highly complex transformation logic applied to the data as it passes through. IBM InfoSphere DataStage:

- Supports collection, integration and transformation of data, with data structures ranging from simple to highly complex
- Is equipped with a parallel processing engine, so DataStage supports massive volumes of data to shorten processing cycles and/or fit into the 'night window'
- Supports real-time integration
- Provides automated documentation, enabling easier maintenance over time
- Supports enterprise connectivity


IBM InfoSphere Metadata Workbench


IBM InfoSphere Metadata Workbench is a comprehensive metadata solution, unifying data sources, ETL processes, business logic, data quality rules, data models, Business Intelligence reports, and business terminology into a single repository.

By managing the enterprise information assets, IBM Metadata Workbench:

- Increases the understanding of and trust in the information by allowing end-to-end visual lineage of the information flow, including where the information came from and how it was derived.
- Provides better business understanding of the information, as it adds business context and meaning to IT assets.
- Enables easy 'search & find' capabilities on information assets, as it classifies all assets in a single repository, and offers advanced reporting services on its classified items.
- Provides better data governance by enabling risk analysis on the data and dependency management.
- Increases business and IT collaboration and communication by sharing a single metadata repository among business and technical users.


In-memory
In traditional querying, the tables and cubes that are being consulted are stored on disk. "In-memory" solutions load all the data into RAM, from where it is accessed by the application. The advantages are a higher speed of querying and a reduced time of data modeling. Also, structures are less predefined and easier to change. The approach obviously comes with extra memory hardware requirements. In-memory solutions are in the process of replacing OLAP techniques, also due to the lower cost of RAM. Popular solutions include QlikView, TM1 and PowerPivot.

KPI or Key Performance Indicator


KPIs are indicators which are evaluated on a periodic basis in order to understand the organization's performance and take corrective action where needed. It is essential that these KPIs represent the strategic goals of the organization. The KPIs measure the performance compared to certain predefined targets. They can be visualized through BI scorecard solutions. KPIs require clear ownership within an organisation. It is also important that the scorecard solution is integrated with the other BI modules in the platform. Finally, it is essential that proper cause-and-effect diagrams are defined to emphasize the importance of the correlation between the different KPIs.

Master data
Master data is the collection of the most important reference data within an organization. This is mostly limited to a certain set of entities, such as customers, products, financial accounts, suppliers and employees. Next to the central governance of this information, attention is given to the ownership and to the processes which maintain the data and ensure that master data can be used properly and consistently in all of the systems and databases of the organisation.

Measure or metric
A measure is a numeric expression which can be used and displayed by dimension on reports, and consists of base facts, aggregated values or mathematical functions.


Metadata
Metadata is data about data. Metadata is used to describe the properties of data such as dimensions, hierarchies, performance metrics and report layout objects. A distinction can be made between business and technical metadata.

Mobile BI
Mobile BI is the ability to deliver analytical information and business metrics over a public network and present them on a functional mobile device (smartphone, netbook) by means of a user-friendly graphical interface. There are several advantages of using mobile BI, such as the possibility to take decisions earlier thanks to a real-time connection with your data. You can fulfill ad hoc needs during conversations with, e.g., customers or suppliers. Mobile BI users often use the traditional BI systems even more intensively. Additionally, from a security point of view, mobile BI is often safer than using spreadsheets or emailing reports.

Natural key or Business key


The natural key or business key is the key of a table as defined within a source system.

Normalize
Normalizing data is a process defined by E.F. Codd and boils down to structuring the data into separate and multiple tables to avoid redundancy in the data.
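As a small illustration (the customer and order data are invented), the snippet below normalizes one wide table into two tables so that customer attributes are stored only once.

```python
import pandas as pd

# One wide, denormalized table: customer details are repeated on every order row.
orders_wide = pd.DataFrame({
    "order_id":      [10, 11, 12],
    "customer_id":   [1, 1, 2],
    "customer_name": ["Acme", "Acme", "Globex"],
    "amount":        [99.5, 10.0, 25.0],
})

# Normalized form: customer attributes move to their own table (one row per customer)
# and the orders table keeps only customer_id as a reference, avoiding redundancy.
customers = orders_wide[["customer_id", "customer_name"]].drop_duplicates()
orders    = orders_wide[["order_id", "customer_id", "amount"]]

print(customers)
print(orders)
```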

ODS
The main reason why an ODS or Operational Data Store is often built is to give users the possibility to execute operational reports. An ODS can be seen as a (partial) copy of the source system. It is then typically also the source for ETL-related activities. The granularity is the same as that of the source system. Most technical data integration solutions have good synchronization methods to sync the ODS with the operational systems. The only issue that can appear is the availability of the data in time or in historical context when reporting directly from the ODS. Managerial reports typically require a higher-level grouping, the combination of sources and/or the aggregation of data. This is the reason why an ODS is not perfect for analytical reporting and data warehouses are being built.


OLAP
OLAP stands for On-Line Analytical Processing. OLAP tools offer the possibility to look at the data in a multi-dimensional way. They give users the power to analyze data from different angles (dimensions) quickly, as aggregates are often precalculated and stored. Interacting with data through the use of OLAP tools is also known as "slice and dice".
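As a rough illustration of multi-dimensional analysis (the sales figures and dimensions are invented), the pandas pivot below looks at the same facts by region and then by product.

```python
import pandas as pd

# Illustrative sales facts with three dimensions: year, region and product.
sales = pd.DataFrame({
    "year":    [2023, 2023, 2024, 2024],
    "region":  ["EMEA", "APAC", "EMEA", "APAC"],
    "product": ["A", "A", "B", "B"],
    "revenue": [100, 80, 120, 90],
})

# "Slicing" revenue by year and region ...
by_region = sales.pivot_table(values="revenue", index="year", columns="region", aggfunc="sum")

# ... and "dicing" the same facts by product instead.
by_product = sales.pivot_table(values="revenue", index="year", columns="product", aggfunc="sum")

print(by_region)
print(by_product)
```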

Primary key
A primary key is the key of a table in order to uniquely identify a single record. It can consist of one column or a combination of columns.

Query
A query is a question to a database, translated into a query language such as SQL, which results in a particular data set being returned.

Report bursting
Report bursting is the possibility to deliver a single report to multiple destinations simultaneously. During the refresh of the report, only one single data fetch is used, and the report is delivered to the corresponding destinations depending on a changing parameter, often also linked to certain security settings. An example is a monthly sales report which is sent to 50 account managers, who each only get the sales to their own customers in the report they receive.
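A minimal sketch of the idea (data, names and the file-based delivery are invented; real tools deliver by mail or portal and apply security settings) is shown below: one data fetch, split per bursting parameter.

```python
import pandas as pd

# One single data fetch for the whole report.
sales = pd.DataFrame({
    "account_manager": ["jdoe", "asmith", "jdoe"],
    "customer":        ["Acme", "Globex", "Initech"],
    "revenue":         [100, 250, 75],
})

# Burst: each recipient only receives the slice of the report that matches
# the bursting parameter (here: account_manager).
for manager, slice_df in sales.groupby("account_manager"):
    filename = f"sales_report_{manager}.csv"   # stand-in for mail or portal delivery
    slice_df.to_csv(filename, index=False)
    print(f"Delivered {len(slice_df)} rows to {manager} as {filename}")
```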

Repository
A repository is a collection of database tables in which information about something is stored in a central place. BI tools like IBM Cognos or SAP BusinessObjects, for example, store security information such as user and user group data, access permissions, documents and metadata information in such a repository, stored in a separate RDBMS.


SaaS
Software as a Service is the online delivery of software over the internet. The software is centrally hosted by a SaaS provider and is accessible to users with a web browser. The user rents the service, while the SaaS provider is responsible for installation, maintenance and upgrades of the software. The solution is typically available in a renting model. Increasingly, Business Intelligence and Corporate Performance Management solutions also become available in a SaaS model.

SAP Analysis, Edition for MS Office


"SAP Analysis, Edition for MS Office is the new name of the Bex Analyzer product as part of the SAP Business Objects 4.0 release. This solution allows SAP Netweaver Business Warehouse users to perform analysis of BW data in Microsoft Excel and Microsoft Office (Powerpoint)

SAP Analysis, Edition for OLAP


Formerly known as SAP BusinessObjects Voyager, "SAP Analysis, edition for OLAP" finally brings OLAP to Business Objects. It is a key deliverable of the SAP BusinessObjects 4.0 release and crucial in unlocking data residing in SAP BW InfoCubes. With this OLAP tool, key users can create analyses on multidimensional data through a web interface.

SAP BEX - SAP Business Explorer


SAP BEx or SAP Business Explorer is the Microsoft Excel add-in reporting tool used to query and work with data from the SAP Business Information Warehouse, or, in short, the BW (Business Warehouse).

SAP Business Objects BI launch pad


The BI launch pad is part of the SAP BusinessObjects 4.0 release and is the new name for the former BO InfoView portal. It runs in the web browser and opens the way to use the different deployed objects created with the SAP BusinessObjects Enterprise components, such as Web Intelligence and Crystal Reports documents.

SAP Business Objects Dashboards


SAP BusinessObjects Dashboards is the new name, in the SAP BusinessObjects 4.0 release, of SAP BO Xcelsius. It is a powerful dashboarding and visualization tool. Complex business data can be presented in an easy way with the use of interactive gauges, charts, sliders and widgets. Decision makers can browse through the data and test future scenarios with basic "what if" functionality.

SAP Business Objects Report Comparison Tool


This is a standalone application used to compare a Business Objects report created in an earlier version with a converted or migrated Business Objects report in a newer version. It uses color coding to alert the user to the detected changes. It is also sometimes referred to as the "Business Objects delta viewer".

SAP Business Objects Report Conversion Tool


This tool can be used to convert Business Objects DeskI reports (Desktop Intelligence) into Business Objects WebI reports (Web Intelligence). The original and converted reports need to be stored in the repository, known as the Central Management Server or CMS. This conversion does not always run smoothly and, depending on the features used in the original report, there are three levels of conversion: fully converted, partially converted and not converted.

SAP Crystal Reports


SAP Crystal Reports is a reporting solution that gives the user the possibility to create professional reports based upon multiple data sources, which can be accessed through, amongst others, ODBC data connections. The reports can be exported to all well-known popular data formats and can be interactively integrated into portals and Microsoft Office documents through .NET, Java and COM applications. Crystal Reports focuses more on so-called "operational" reporting and in its origin remains a report writer, whereas Web Intelligence is more end-user-oriented BI.


SAP HANA - SAP High Performance Analytical Appliance


HANA stands for High Performance Analytical Appliance and is an SAP in-memory, column-based appliance to optimise performance for Business Intelligence and data warehousing related applications. This product, originally code-named New DB, is a follow-on from the SAP Business Warehouse Accelerator. The R&D capabilities of Sybase, through its Sybase IQ solution (which is also column-based), also contributed to the solution.

Slice and dice


The act of browsing the data using all different combinations of dimensions with the help of an OLAP tool in order to see the corresponding values or measures.

Slowly changing dimension

Slowly changing dimensions are a set of data warehouse techniques to deal with history within a dimension. Dimensions are related to facts and, while the facts are basically always transaction-oriented with an associated date, a dimension is not. Therefore a dimension may require history as well. Three basic types exist:

- Type 1, for which no history is kept;
- Type 2, for which history is kept for certain columns, based on all changes at any time in a source system for individual records;
- Type 3, for which history is kept for a very limited set of columns, based on changes at a certain point in time for a bulk set of records.
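A minimal Type 2 sketch (customer data, dates and column names are invented) is given below: instead of overwriting the changed attribute, the current dimension row is closed and a new version is added.

```python
import pandas as pd

# Customer dimension with one row per historical version of the business key C001.
# The open-ended valid_to date marks the current version.
dim_customer = pd.DataFrame({
    "customer_sk": [1],
    "customer_id": ["C001"],
    "city":        ["Antwerp"],
    "valid_from":  ["2020-01-01"],
    "valid_to":    ["9999-12-31"],
})

# The customer moves: a Type 2 change closes the old row and appends a new version
# (a Type 1 change would simply overwrite the city and lose the history).
change_date = "2024-06-01"
mask = (dim_customer["customer_id"] == "C001") & (dim_customer["valid_to"] == "9999-12-31")
dim_customer.loc[mask, "valid_to"] = change_date

new_version = pd.DataFrame({
    "customer_sk": [2],
    "customer_id": ["C001"],
    "city":        ["Ghent"],
    "valid_from":  [change_date],
    "valid_to":    ["9999-12-31"],
})
dim_customer = pd.concat([dim_customer, new_version], ignore_index=True)

print(dim_customer)
```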

SMART
Defining goals you want to reach is easier said than done. Checking goals against the SMART criteria can help in assessing whether the defined goals can be realised. SMART is an acronym for Specific, Measurable, Achievable, Realistic and Timely.

- Specific: objectives should be clearly and well defined. This definition can consist of a scope description, stating what is included, how things should be done and why we are doing it.
- Measurable: to be able to determine whether the goals are reached, the goals must be sufficiently measurable. Examples of what can be measured are quantity, quality, time and cost.
- Achievable: the objectives need to be accepted by the employee and his team. It is important not to define goals that are considered not achievable.
- Realistic: "can it be done?" When the goal is not realistic, it demotivates the employees.
- Timely: the goal should have a timeline within which the objective is supposed to be realised.

Snowflake schemas
Two "dimensional modeling techniques are generally used for modeling the data structures within a data warehouse context. These are the "star schema and "snowflake schema techniques. Both find their roots within the Kimball dimensional modeling techniques. Properties of the snowflake schemas :

- the hierarchical structure can be visualized
- some tools don't support snowflakes
- destroys "browsing" speed and flexibility
- more tables
- less storage
- possibly slower
- generally to be avoided in the context of end-user queries


Star schemas
Two "dimensional modeling techniques are generally used for modeling the data structures within a data warehouse context. These are the "star schema and "snowflake schema techniques. Both find their roots within the Kimball dimensional modeling techniques. A star schema is the preferred technique within the Kimball approach. Properties of star schemas :

- easier to navigate
- improves query performance
- eases the "browsing" of dimensions
- more widely used
- the dimensions are fully de-normalized
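As a small illustration of the pattern (table and column names are invented), the SQLite sketch below builds one fact table with two de-normalized dimension tables and runs a typical star-schema query that joins and aggregates them.

```python
import sqlite3

conn = sqlite3.connect(":memory:")

# One fact table surrounded by de-normalized dimension tables.
conn.executescript("""
    CREATE TABLE dim_date    (date_sk INTEGER PRIMARY KEY, year INTEGER, month INTEGER);
    CREATE TABLE dim_product (product_sk INTEGER PRIMARY KEY, product_name TEXT, category TEXT);
    CREATE TABLE fact_sales  (date_sk INTEGER, product_sk INTEGER, revenue REAL);

    INSERT INTO dim_date    VALUES (1, 2024, 1), (2, 2024, 2);
    INSERT INTO dim_product VALUES (10, 'Widget', 'Hardware'), (11, 'Gadget', 'Hardware');
    INSERT INTO fact_sales  VALUES (1, 10, 100.0), (1, 11, 50.0), (2, 10, 75.0);
""")

# A typical star-schema query: join the fact table to its dimensions and aggregate.
rows = conn.execute("""
    SELECT d.year, d.month, p.category, SUM(f.revenue) AS revenue
    FROM fact_sales f
    JOIN dim_date d    ON d.date_sk = f.date_sk
    JOIN dim_product p ON p.product_sk = f.product_sk
    GROUP BY d.year, d.month, p.category
""").fetchall()

print(rows)
```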

Structured data
Structured data consists of any data stored in a structured format. That is, in the structured content there is a conceptual definition and a data type definition behind the information. The data has known relationships and predefined data types. This structure can be used by technologies without human intervention and allows us to read, query, analyze and understand the information. Nowadays, 95% of the data handled in a BI context is structured.

Surrogate key or technical key


A surrogate key or technical key is an artificially created numerical key without any business meaning. These keys are used within a data warehouse environment in order to easily combine tables.
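As a trivial sketch (the customer identifiers are invented), the snippet below assigns sequential surrogate keys to a set of business keys; the warehouse then joins on these meaningless numbers rather than on the source-system keys.

```python
import pandas as pd

# Natural (business) keys as they arrive from the source systems.
customers = pd.DataFrame({"customer_id": ["C001", "C002", "C003"]})

# Assign an artificial, meaningless integer as the surrogate (technical) key.
customers["customer_sk"] = range(1, len(customers) + 1)

print(customers)
```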

Unstructured data
Unstructured data is data that is not structured (see also "Structured data"). People use unstructured data every day; examples of unstructured data are emails, webpages, documents, images and spreadsheets. This kind of data does not have a structured format; it has no conceptual or data type definition. Human intervention is needed to make this unstructured data ready for technological processing. Within a BI context, unstructured data is more and more considered to be a potential source for the data warehouse and thus also for being manipulated in the ETL layer.

