You have already come across the basics of what methodologies are and their stages. You have also gathered the basic concept of conceptual methodology and how it works within the main stages of the database system development life cycle.
This stage is made up of three phases:
- Conceptual,
- Logical, and
- Physical database design
In this chapter you will learn and understand the basic concepts of Logical Methodology, i.e. the second phase of database design in the database development life cycle.
Details on Logical Methodology
A local logical data model characterizes the data requirements of one or more, but not all, user views of a database, while a global logical data model represents the data requirements for all user views. The final step of the logical database design phase is to reflect on how well the model is able to support possible future developments of the database system.
Logical Database Design Methodology for the Relational Model
The objective of the logical database design methodology is to translate the conceptual data model into a logical data model and then validate this model to check whether it is structurally correct and able to support the required transactions.
In this step of the database development life cycle, the main purpose is to translate the conceptual data model created in the conceptual methodology (of the previous chapter) into a logical data model of the data requirements of the enterprise. This objective can be achieved by following the activities given below:
- Derive the relations for the logical data model
- Validate those relations using normalization
- Validate those relations against user transactions
- Check integrity constraints
- Evaluate logical data model with user
- Merge the logical data models into a global model (this step is optional)
- Check for future growth and development
The structure of the relational schema is validated using normalization, after which you ensure that the relations are capable of supporting the transactions given in the users’ requirements specification. You then check that all important integrity constraints are represented by the logical data model. At this stage, the logical data model is validated by the users to ensure that they consider the model to be a true representation of the data requirements of the enterprise.
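To make the integrity-constraint check concrete, the following is a minimal sketch using Python's built-in `sqlite3` module and a hypothetical `Staff` relation (the table and attribute names are illustrative, not from the text). It shows the three most common kinds of constraint a logical data model must capture: entity integrity (primary key), required data (`NOT NULL`), and a domain constraint (`CHECK`).

```python
import sqlite3

# Hypothetical Staff relation illustrating declared integrity constraints.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE Staff (
        staff_no INTEGER PRIMARY KEY,        -- entity integrity
        name     TEXT NOT NULL,              -- required data
        salary   REAL CHECK (salary >= 0.0)  -- domain constraint
    )
""")
conn.execute("INSERT INTO Staff VALUES (1, 'Ann', 30000.0)")

# A row that violates the domain constraint is rejected by the DBMS,
# confirming the constraint is actually enforceable, not just documented.
try:
    conn.execute("INSERT INTO Staff VALUES (2, 'Bob', -5.0)")
except sqlite3.IntegrityError as e:
    print("rejected:", e)
```

Declaring constraints in the schema, rather than relying on application code, means every transaction against the relation is subject to the same checks.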
Derive Relations for Logical Data Model
The relationship that an entity has with another entity is represented using the primary key / foreign key mechanism. In deciding where to post the foreign key attribute(s), you must first identify the ‘parent’ and ‘child’ entities involved in the relationship. The parent entity is the one that posts a copy of its primary key into the relation representing the child entity, to act as the foreign key. Relations can be derived for the following structures that may occur in a conceptual data model:
- strong entity types
- weak entity types
- one-to-many (1:*) binary relationship types
- one-to-one (1:1) binary relationship types
- one-to-one (1:1) recursive relationship types
- superclass/subclass relationship types
- many-to-many (*:*) binary relationship types
- complex relationship types
- multi-valued attributes
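The most common of these cases, the one-to-many (1:*) binary relationship, can be sketched as follows with `sqlite3` and a hypothetical Department/Employee pair (the names are illustrative). `Department` is the parent: a copy of its primary key `dept_no` is posted into the child relation `Employee` as a foreign key.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")  # SQLite enforces FKs only when enabled

# Parent relation: its primary key will be posted into the child.
conn.execute("""
    CREATE TABLE Department (
        dept_no   INTEGER PRIMARY KEY,
        dept_name TEXT NOT NULL
    )
""")
# Child relation: receives a copy of the parent's primary key
# as a foreign key, representing the 1:* relationship.
conn.execute("""
    CREATE TABLE Employee (
        emp_no   INTEGER PRIMARY KEY,
        emp_name TEXT NOT NULL,
        dept_no  INTEGER,
        FOREIGN KEY (dept_no) REFERENCES Department(dept_no)
    )
""")

conn.execute("INSERT INTO Department VALUES (10, 'Sales')")
conn.execute("INSERT INTO Employee VALUES (1, 'Ann', 10)")

# The foreign key enforces the relationship: an employee cannot
# reference a department that does not exist.
try:
    conn.execute("INSERT INTO Employee VALUES (2, 'Bob', 99)")
except sqlite3.IntegrityError as e:
    print("rejected:", e)
```

Many employees may share one `dept_no`, but each employee references at most one department, which is exactly the 1:* structure.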
Validate Relations Using Normalization
In the previous step you derived a set of relations to represent the conceptual data model created earlier. In this step you validate the grouping of attributes in each relation using the rules of normalization. The purpose of normalization is to ensure that the set of relations has a minimal, and yet sufficient, number of attributes necessary to support the data requirements of the enterprise.
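As a small sketch of what normalization catches, consider a hypothetical (illustrative, not from the text) relation that stores a staff member's branch address alongside the staff details. `branch_address` depends on `branch_no`, not directly on the key `staff_no`, so the address is repeated for every staff member at the same branch; decomposing removes the redundancy.

```python
# Unnormalized relation: (staff_no, name, branch_no, branch_address).
# branch_no -> branch_address is a transitive dependency via staff_no,
# so the address is stored redundantly.
staff_branch = [
    (1, "Ann", "B01", "12 High St"),
    (2, "Bob", "B01", "12 High St"),  # address repeated for branch B01
    (3, "Cy",  "B02", "9 Park Rd"),
]

# Decompose into two relations, each with a minimal set of attributes:
# Staff(staff_no, name, branch_no) and Branch(branch_no, branch_address).
staff  = [(s, n, b) for (s, n, b, _) in staff_branch]
branch = sorted({(b, a) for (_, _, b, a) in staff_branch})

print(staff)   # [(1, 'Ann', 'B01'), (2, 'Bob', 'B01'), (3, 'Cy', 'B02')]
print(branch)  # [('B01', '12 High St'), ('B02', '9 Park Rd')]
```

After the decomposition each fact is stored once: updating a branch address now touches a single row instead of every staff row at that branch.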
Validate Relations Against User Transactions
The main purpose of this step is to validate the logical data model to make certain that it supports the transactions required by the users’ requirements specification. Using the relations, the primary key / foreign key links between the relations, the ER diagram, and the data dictionary, you attempt to perform the required operations manually. If you can resolve all transactions in this way, the logical data model is validated against the transactions.
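The manual walk-through described above can be sketched as follows, again with `sqlite3` and a hypothetical Department/Employee schema (names are illustrative). The assumed user transaction is "list the employees of a given department"; it is resolved by following the primary key / foreign key link between the two relations, which is exactly the check this step performs.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE Department (dept_no INTEGER PRIMARY KEY, dept_name TEXT);
    CREATE TABLE Employee  (emp_no INTEGER PRIMARY KEY, emp_name TEXT,
                            dept_no INTEGER REFERENCES Department(dept_no));
    INSERT INTO Department VALUES (10, 'Sales'), (20, 'HR');
    INSERT INTO Employee  VALUES (1, 'Ann', 10), (2, 'Bob', 10), (3, 'Cy', 20);
""")

# Transaction: "list the employees of the Sales department".
# If the join resolves using only the declared key links, the model
# supports this transaction; a missing link would surface here as a
# join that cannot be written.
rows = conn.execute("""
    SELECT e.emp_name
    FROM Employee e
    JOIN Department d ON e.dept_no = d.dept_no
    WHERE d.dept_name = 'Sales'
    ORDER BY e.emp_name
""").fetchall()
print(rows)  # [('Ann',), ('Bob',)]
```

Each required transaction is tried in turn; any transaction that cannot be expressed over the relations and their key links signals a gap in the logical data model.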