Information management is a challenge. Failure patterns in information availability may be directly related to the data modeling techniques the organization has adopted for Cloud computing.  Data modeling can be defined as the process of defining and analyzing data requirements in support of business processes.  The implication is that stakeholder involvement in data modeling is an essential condition for data access and data availability for a specific user on the system.  Failure to involve one or more stakeholders in defining data access protocols can result in performance failures and user complaints.

A drill down from the symptoms of failure to the cause of failure may locate the point of failure at the database conceptualization level.  Database conceptualization is a set of technology-independent specifications, defined by the stakeholders or business users within an organization.  Points of failure arise when stakeholders fail to identify a business need upfront; as a result, that business need is not serviced, and the shortfall may be perceived as a database failure.

Even where the business specifications are correct and all business requirements have been correctly identified, there may be failures of logical translation. The logical data model may be defective at a structural level for any or all of the following reasons:

  1. Failure to use appropriate software modeling tools
  2. Failure to consider use cases of different users
  3. Failure to map out dependencies
  4. Failure to completely and accurately define relationships between entities
  5. Failure to group attributes correctly
  6. Failure to define keys or set constraints properly
  7. Failure to orchestrate cascade updates or cascade deletes correctly (see the sketch after this list)
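
As a minimal sketch of items 6 and 7 (assuming Python's standard sqlite3 module; the customer and orders tables are purely illustrative, not taken from any particular system), the example below shows how an explicit foreign key with ON DELETE CASCADE keeps child rows consistent when a parent row is removed. Omitting the key constraint or the cascade rule is exactly the kind of structural defect that surfaces later as orphaned data.

```python
import sqlite3

# Illustrative schema only: "customer" and "orders" are hypothetical tables.
conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")  # SQLite enforces FKs only when enabled

conn.execute("""
    CREATE TABLE customer (
        customer_id INTEGER PRIMARY KEY,
        name        TEXT NOT NULL
    )
""")
conn.execute("""
    CREATE TABLE orders (
        order_id    INTEGER PRIMARY KEY,
        customer_id INTEGER NOT NULL,
        amount      REAL CHECK (amount > 0),
        -- Without this clause (item 6), orders could reference missing customers;
        -- without ON DELETE CASCADE (item 7), deleting a customer leaves orphans.
        FOREIGN KEY (customer_id)
            REFERENCES customer (customer_id)
            ON DELETE CASCADE
    )
""")

conn.execute("INSERT INTO customer VALUES (1, 'Acme Corp')")
conn.execute("INSERT INTO orders VALUES (10, 1, 99.50)")

# Deleting the parent row cascades to its orders instead of orphaning them.
conn.execute("DELETE FROM customer WHERE customer_id = 1")
print(conn.execute("SELECT COUNT(*) FROM orders").fetchone()[0])  # -> 0
conn.close()
```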

Consequently, the physical data model (the organization of data elements and their relationships into tables and columns) may perform sub-optimally.

In summary, effective database management requires skilled personnel and state-of-the-art software design tools. The database itself must be intelligently structured, with components and objects clearly defined.  Input/output problems must be handled; segmentation and partitioning of tables must be well thought through; the level of normalization and indexing must be appropriate.  Redesigning the physical structure of the database at a later date is a nightmare in the making.  Quick-fix approaches bring short-term benefits and long-term headaches.  Corporations that are not willing to spend hours at the drawing board end up compromising on long-term database performance needs. Compounding the problem, application life cycles are getting shorter, and the Cloud-ability of data is defocused while the application remains in focus.
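
To illustrate the indexing point, the sketch below (a minimal example, again assuming Python's standard sqlite3 module and a hypothetical event_log table) runs the same filtered query before and after creating an index on the filtered column, so the change in the query plan from a full table scan to an index search can be observed directly. The same reasoning applies, at a larger scale, to segmentation and partitioning decisions.

```python
import sqlite3

# Hypothetical "event_log" table used only to illustrate the indexing point.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE event_log (id INTEGER PRIMARY KEY, user_id INTEGER, payload TEXT)"
)
conn.executemany(
    "INSERT INTO event_log (user_id, payload) VALUES (?, ?)",
    [(i % 100, "x") for i in range(10_000)],
)

query = "SELECT COUNT(*) FROM event_log WHERE user_id = 42"

# Before indexing: the planner falls back to a full table scan.
print(conn.execute("EXPLAIN QUERY PLAN " + query).fetchall())

# After indexing the filtered column, the same query uses the index.
conn.execute("CREATE INDEX idx_event_log_user ON event_log (user_id)")
print(conn.execute("EXPLAIN QUERY PLAN " + query).fetchall())
conn.close()
```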