Topic: Denormalisation
Advanced Databases
• The result of normalisation is a logical database design that is structurally consistent and has minimal redundancy.
• So it's all perfect. Is it?
Question
• Does it ever make sense to deliberately relax the normalisation rules and introduce redundancy into the system?
Answer • The answer is yes, but only when it is estimated that the system may not be able to meet its performance requirements.
Denormalisation
• A fully normalised design does not necessarily provide maximum processing efficiency.
• In this situation, introducing redundancy in a controlled manner by relaxing the normalisation rules can improve the performance of the system.
Denormalisation
• When we talk about denormalisation we are not just talking about normal forms.
• For example, we may decide to leave some portion of the logical data model in 2NF and the rest in 3NF.
• In general, we use the term loosely to refer to situations where we combine relations: the new relation may still be normalised but can contain nulls. There are other techniques as well.
Denormalisation
• Normalisation often splits a table into two tables (as in our various examples).
• Then, when you need data from both tables, you have to access two tables instead of one.
• In some situations we may decide to leave the relations in 2NF, because access is faster.
Denormalisation
• Normalisation is still very important for database design.
• In addition, the following factors have to be considered:
• Denormalisation makes implementation more complex.
• Denormalisation often sacrifices flexibility.
• Denormalisation may speed up retrievals, but it slows down updates.
Denormalisation techniques
• Denormalisation uses various techniques.
• The techniques chosen will depend on how the database is used.
• Consider the use of these techniques for frequent or critical transactions:
Consider the introduction of controlled redundancy (denormalisation)
• Some techniques:
• 1 Combining 1:1 relationships
• 2 Duplicating non-key attributes in 1:* relationships to reduce joins
• 3 Duplicating attributes in *:* relationships to reduce joins
• 4 Introducing repeating groups
• 5 Partitioning relations into smaller chunks
Sample Relation Diagram
Sample Relations
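The relation diagram and sample relations shown on these two slides are not reproduced in this text version. As a rough stand-in, here is a minimal sketch of the kind of schema the following slides appear to assume (a property-rental database with clients, interviews, properties and a property-type lookup table); all table and column names are assumptions for illustration, not taken from the slides.

```python
# Hedged sketch of an assumed sample schema, built in an in-memory SQLite DB.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE client (
    client_no  TEXT PRIMARY KEY,
    name       TEXT,
    max_rent   REAL
);

-- 1:1 with client: each client has at most one interview record
CREATE TABLE interview (
    client_no       TEXT PRIMARY KEY REFERENCES client(client_no),
    interview_date  TEXT,
    comments        TEXT
);

-- Lookup table for property types
CREATE TABLE property_type (
    type_code    TEXT PRIMARY KEY,
    description  TEXT
);

CREATE TABLE property_for_rent (
    property_no  TEXT PRIMARY KEY,
    street       TEXT,
    city         TEXT,
    type_code    TEXT REFERENCES property_type(type_code),
    rent         REAL
);
""")
```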
1 Combining 1:1 relationships
• Queries on the interview details for a client are very frequent in this database.
• The client and interview details are currently held in two separate tables.
• Combining the two tables will make these queries faster: it is faster to query one table than to join two.
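A minimal sketch of this technique, assuming the client/interview schema sketched earlier; the combined table name and the client number "CR76" are illustrative, not from the slides.

```python
# Hedged sketch: combining the 1:1 client/interview relations into one table
# so the frequent "interview details for a client" query touches a single
# table instead of joining two.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE client_interview (
    client_no       TEXT PRIMARY KEY,
    name            TEXT,
    max_rent        REAL,
    -- interview columns folded in; NULL for clients not yet interviewed
    interview_date  TEXT,
    comments        TEXT
);
""")

# Before: SELECT ... FROM client JOIN interview USING (client_no)
# After:  a single-table lookup
rows = conn.execute(
    "SELECT name, interview_date, comments FROM client_interview WHERE client_no = ?",
    ("CR76",)
).fetchall()
```

Note that this matches the earlier point: the combined relation is still normalised, but it may contain nulls for clients who have not been interviewed.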
2 Duplicating non-key attributes in 1:* relationships: lookup table
• This table shows all the properties available for rent.
• You have to go to the lookup table to "translate" the property type code into its description.
2 Duplicating non-key attributes in 1:* relationships: lookup table duplicated
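A hedged sketch of the duplicated lookup attribute, again using the assumed property-rental schema; the type_description column name is an invention for illustration.

```python
# Hedged sketch: duplicating the non-key attribute (the type description)
# from the lookup table into property_for_rent, so listing properties no
# longer needs a join just to "translate" the type code.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE property_type (
    type_code    TEXT PRIMARY KEY,
    description  TEXT
);

CREATE TABLE property_for_rent (
    property_no       TEXT PRIMARY KEY,
    street            TEXT,
    city              TEXT,
    type_code         TEXT REFERENCES property_type(type_code),
    type_description  TEXT,   -- duplicated from property_type (redundant)
    rent              REAL
);
""")

# The redundant copy must be kept in step with the lookup table on update,
# e.g. via application code or a trigger; that is the cost of the speed-up.
```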
3 Duplicating attributes in *:* relationships to reduce joins
• When you're using an intermediate (bridging) table, you can sometimes add duplicate attributes to the bridging table to speed up queries.
• e.g. a furniture store: an orders table is linked to an order items table, which is linked to a products table.
• How do I show all orders for products of type "chair"?
• Add the product type to the order_items table for speed.
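A hedged sketch of the furniture-store example; the table and column names (orders, order_items, products, product_type) are assumptions chosen to match the slide's wording.

```python
# Hedged sketch: the *:* relationship between orders and products goes
# through the order_items bridging table, and the product's type is
# duplicated into order_items so "all orders for chairs" avoids a join.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE orders   (order_no   INTEGER PRIMARY KEY, order_date TEXT);
CREATE TABLE products (product_no INTEGER PRIMARY KEY, name TEXT, product_type TEXT);

CREATE TABLE order_items (
    order_no      INTEGER REFERENCES orders(order_no),
    product_no    INTEGER REFERENCES products(product_no),
    quantity      INTEGER,
    product_type  TEXT,           -- duplicated from products (redundant)
    PRIMARY KEY (order_no, product_no)
);
""")

# All orders containing a chair, without joining to the products table:
chair_orders = conn.execute(
    "SELECT DISTINCT order_no FROM order_items WHERE product_type = 'chair'"
).fetchall()
```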
4 Introducing repeating groups
• But only where there is a limited number of repetitions, e.g. a maximum of three telephone numbers.
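A minimal sketch of a repeating group folded back into the parent table, assuming the client table from the earlier sketch; the phone_1 to phone_3 column names are illustrative.

```python
# Hedged sketch: re-introducing a repeating group where the number of
# repetitions is small and fixed. Up to three phone numbers become three
# nullable columns instead of rows in a separate telephone table.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
CREATE TABLE client (
    client_no  TEXT PRIMARY KEY,
    name       TEXT,
    phone_1    TEXT,   -- repeating group folded back in;
    phone_2    TEXT,   -- NULL when the client has fewer
    phone_3    TEXT    -- than three numbers
)
""")
```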
5 Partitioning relations
• Rather than combining relations together, an alternative approach is to decompose them into a number of smaller and more manageable partitions.
• There are two main types of partitioning: horizontal and vertical.
• It is a very important technique for performance tuning.
• "Don't give me such a big table to search... divide it up to help me."
• How do I decide what to chop the table into?
5 Partitioning relations
• Horizontal: handy if there's a natural split, e.g. customers in Dublin in one table and all other customers in a second table.
• Vertical: useful when some columns aren't used much, so they can be moved into a separate table.
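A hedged sketch of the two kinds of split, using ordinary tables to stand in for what many DBMSs can do with built-in partitioning features; all names are assumptions.

```python
# Hedged sketch: horizontal and vertical partitioning, simulated with plain
# tables in SQLite.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
-- Horizontal partitioning: same columns, rows split by a natural criterion
CREATE TABLE customer_dublin (cust_no INTEGER PRIMARY KEY, name TEXT, city TEXT);
CREATE TABLE customer_other  (cust_no INTEGER PRIMARY KEY, name TEXT, city TEXT);

-- Vertical partitioning: frequently used columns kept apart from rarely
-- used ones; both partitions share the primary key so they can be re-joined
CREATE TABLE customer_core  (cust_no INTEGER PRIMARY KEY, name TEXT, phone TEXT);
CREATE TABLE customer_extra (cust_no INTEGER PRIMARY KEY, notes TEXT, marketing_prefs TEXT);
""")
```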
5 Partitioning relations
• Suppose a mobile phone company has a customer service system in which all service calls are logged.
• Transaction description: when a phone call is received, the customer service clerk usually searches the database for a customer's calls on a specific day.
• There are 1M customers, and each makes on average 2 calls per day.
• That is about 2M records added per day, or roughly 730M (2 x 365 million) records per year.
• It could take days to query the calls for a specific customer.
• Could we use partitioning?
5 Partitioning relations: using partitioning
• Partition the customer_calls table into separate partitions per customer? Per day?
• Per day is easier, because you don't have to keep "old" partitions up to date.
• The search space goes from hundreds of millions of records (the full call history) to about 2M (one day's calls).
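A hedged sketch of per-day horizontal partitioning of the customer_calls table, simulated with one SQLite table per day since SQLite has no declarative partitioning; a production DBMS would normally use its native range-partitioning support instead. The table and column names, and the helper functions, are assumptions for illustration.

```python
# Hedged sketch: manual per-day partitioning so a "calls for customer X on
# day D" search only scans that day's rows.
import sqlite3

conn = sqlite3.connect(":memory:")

def partition_name(day: str) -> str:
    # e.g. "2024-05-01" -> "customer_calls_2024_05_01"
    return "customer_calls_" + day.replace("-", "_")

def log_call(conn, day, cust_no, call_time, duration_sec):
    table = partition_name(day)
    conn.execute(f"""CREATE TABLE IF NOT EXISTS {table} (
        cust_no INTEGER, call_time TEXT, duration_sec INTEGER)""")
    conn.execute(f"INSERT INTO {table} VALUES (?, ?, ?)",
                 (cust_no, call_time, duration_sec))

def calls_for(conn, day, cust_no):
    # Search space is one day's partition (roughly 2M rows), not the history
    table = partition_name(day)
    return conn.execute(f"SELECT * FROM {table} WHERE cust_no = ?",
                        (cust_no,)).fetchall()

log_call(conn, "2024-05-01", 42, "09:15", 180)
print(calls_for(conn, "2024-05-01", 42))
```

The trade-off is that a query spanning many days now has to touch many partitions, which is why the slide favours partitioning on the attribute the frequent transaction filters on (the day).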
5 Partitioning relations: benefits
• 1. Smaller and more manageable pieces of data (partitions)
• 2. Reduced recovery time
• 3. The impact of a failure is smaller
• 4. Import/export can be done at the partition level
• 5. Faster access to data
• 6. Partitions work independently of the other partitions
• 7. Very easy to use
Database Design
• We've looked at techniques and concepts:
• ERDs: entities, relationships, cardinalities
• Normalisation
• Denormalisation
Database design phases
• Conceptual and logical design: the DBMS is not necessarily known.
• Physical design: the DBMS must be known.
• Normalisation is done in the logical design step; denormalisation is done in the physical design step.
Design methodology defines 3 phases
• Conceptual: the process of constructing a model of the data used in an enterprise, independent of all physical considerations.
• Logical: the process of constructing a model of the data used in an enterprise based on a specific data model, but independent of a particular DBMS and other physical considerations. (What?)
• Physical: the process of producing a description of the implementation of the database on secondary storage; it describes the base relations, file organisations and indexes used to achieve efficient access to the data, and any associated integrity constraints and security measures. (How?)
Summary
• Sometimes, after normalising, we need to revisit the design in order to improve performance.
• Denormalisation is investigated for frequent or critical transactions.
• Various techniques are available.