
What is Denormalization?

Denormalization refers to a refinement of the relational schema such that the degree of
normalization for a modified relation is less than the degree of at least one of the original
relations. Denormalization can also refer to a process in which two relations are combined
into one new relation; the new relation is still normalized but contains more nulls than the
original relations.
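The second definition can be illustrated with a small sketch. The schema below (`Client` and `Interview` are hypothetical names chosen for illustration) merges two normalized relations with a left join; clients who were never interviewed pick up a NULL in the combined relation:

```python
import sqlite3

# Two normalized relations: not every client has an interview,
# so merging them introduces NULLs for the missing rows.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE Client (client_no INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE Interview (client_no INTEGER PRIMARY KEY REFERENCES Client,
                        interview_date TEXT);
INSERT INTO Client VALUES (1, 'Alice'), (2, 'Bob');
INSERT INTO Interview VALUES (1, '2024-01-15');  -- only Alice was interviewed

-- Denormalized: one relation holding both; Bob's interview_date is NULL
CREATE TABLE ClientMerged AS
SELECT c.client_no, c.name, i.interview_date
FROM Client c LEFT JOIN Interview i ON c.client_no = i.client_no;
""")
rows = cur.execute("SELECT * FROM ClientMerged ORDER BY client_no").fetchall()
print(rows)  # [(1, 'Alice', '2024-01-15'), (2, 'Bob', None)]
```

The merged relation still has a single key (`client_no`), so it remains normalized; the cost is the extra NULLs.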

Normalization
Normalization produces a logical database design that is structurally consistent and has
minimal redundancy. Normalization forces us to understand completely each attribute that must
be represented in the database; this may be the most important factor contributing to the
overall success of the system.
In addition, the following factors have to be considered: denormalization makes implementation
more complex; denormalization often sacrifices flexibility; and denormalization may speed up
retrievals but slows down updates.
Why Denormalize Relations?
It is sometimes argued that a normalized database design does not provide maximum processing
efficiency. There may be circumstances where it is necessary to accept the loss of some of
the benefits of a fully normalized design in favor of performance.
Benefits of Normalization

Normalization produces smaller tables with smaller rows:

More rows per page (less logical I/O)
More rows per I/O (more efficient)
More rows fit in cache (less physical I/O)
The benefits of normalization include:
Searching, sorting, and creating indexes are faster, since tables are narrower, and more
rows fit on a data page.
You usually wind up with more tables. You can have more clustered indexes (you get
only one per table) so you get more flexibility in tuning queries.
Index searching is often faster, since indexes tend to be narrower and shorter.
More tables allow better use of segments to control physical placement of data.

You usually wind up with fewer indexes per table, so data modification commands are
faster.
You wind up with fewer null values and less redundant data, making your database more
compact.
Triggers execute more quickly if you are not maintaining redundant data.
Data modification anomalies are reduced.
Normalization is conceptually cleaner and easier to maintain and change as your needs
change.
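The redundancy and anomaly points above can be made concrete with a small sketch (the table and column names here are hypothetical, chosen for illustration). A flat table repeats the customer name on every order; normalizing splits it into two narrower tables so that renaming a customer touches exactly one row:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
-- Unnormalized: customer name repeated on every order (redundant data)
CREATE TABLE OrderFlat (order_no INTEGER PRIMARY KEY, cust_name TEXT, amount REAL);
INSERT INTO OrderFlat VALUES (1, 'Acme', 10.0), (2, 'Acme', 20.0), (3, 'Zenith', 5.0);

-- Normalized: narrower tables, each fact stored exactly once
CREATE TABLE Customer (cust_no INTEGER PRIMARY KEY, cust_name TEXT UNIQUE);
CREATE TABLE Orders (order_no INTEGER PRIMARY KEY,
                     cust_no INTEGER REFERENCES Customer, amount REAL);
INSERT INTO Customer (cust_name) SELECT DISTINCT cust_name FROM OrderFlat;
INSERT INTO Orders
SELECT f.order_no, c.cust_no, f.amount
FROM OrderFlat f JOIN Customer c ON c.cust_name = f.cust_name;
""")
# Renaming a customer now updates one row instead of one row per order,
# which is exactly the update anomaly normalization removes.
cur.execute("UPDATE Customer SET cust_name = 'Acme Corp' WHERE cust_name = 'Acme'")
names = [r[0] for r in cur.execute(
    "SELECT DISTINCT c.cust_name FROM Orders o "
    "JOIN Customer c USING (cust_no) ORDER BY 1")]
print(names)  # ['Acme Corp', 'Zenith']
```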
Good reasons for denormalizing are:
All or nearly all of the most frequent queries require access to the full set of joined data
A majority of applications perform table scans when joining tables
Computational complexity of derived columns requires temporary tables or excessively
complex queries
Disadvantages of Denormalization

Denormalization has these disadvantages:


It usually speeds retrieval but can slow data modification.
It is always application-specific and needs to be re-evaluated if the application changes.
It can increase the size of tables.
In some instances, it simplifies coding; in others, it makes coding more complex.
Performance Advantages of Denormalization

Denormalization can improve performance by:


Minimizing the need for joins
Reducing the number of foreign keys on tables

Reducing the number of indexes, saving storage space and reducing data modification
time
Precomputing aggregate values, that is, computing them at data modification time rather
than at select time
Reducing the number of tables (in some cases)
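Precomputing aggregates at data modification time is commonly done with a trigger. The sketch below (hypothetical `Account`/`Txn` schema) maintains a denormalized running balance so that reads never re-sum the transaction table:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE Account (acct_no INTEGER PRIMARY KEY, balance REAL DEFAULT 0);
CREATE TABLE Txn (txn_no INTEGER PRIMARY KEY, acct_no INTEGER, amount REAL);

-- Maintain the denormalized aggregate on every insert: the cost is
-- paid at modification time, not at select time.
CREATE TRIGGER txn_ins AFTER INSERT ON Txn
BEGIN
    UPDATE Account SET balance = balance + NEW.amount
    WHERE acct_no = NEW.acct_no;
END;

INSERT INTO Account (acct_no) VALUES (1);
INSERT INTO Txn (acct_no, amount) VALUES (1, 100.0), (1, -30.0);
""")
balance = cur.execute(
    "SELECT balance FROM Account WHERE acct_no = 1").fetchone()[0]
print(balance)  # 70.0
```

This trades slower inserts (each one fires the trigger) for a constant-time balance lookup, which is the retrieval/update trade-off described throughout this section.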
