Data and Database Management – Centralized versus Decentralized

Traditionally in many organizations, the control of data resources has been centralized because data management originated in the mainframe environment.  However, with the emergence of client-server technology and the blending of data and process in object-oriented methodologies, many organizations are questioning the need to retain centralized data and database management functions.  These organizations are experimenting with decentralized data and database management, in which application developers/programmers/integrators perform many, if not all, of the functions of a data analyst and/or database analyst.  There are numerous risks associated with such a decision and few benefits.

Data analysis, database administration, and application development/integration are very separate functions and require separate skill sets.  A data analyst, who is responsible for the conceptual and logical gathering and organization of information facts, is a person with broad-based analytical talents who is a good and discerning listener and has excellent oral and written communication skills.  These talents serve the data analyst by making it possible to determine the relevant facts (data) in a business user’s description of the information needed to perform a function.  Frequently, this description flows in a “stream-of-consciousness” manner, and the listening and analytical skills help the data analyst focus on the real entities and attributes instead of the inconsequential information.  This skill is developed through training and practice and is essential to the proper collection and organization of relevant data.

As the area responsible for the establishment and reusability of data, the data analyst is expected to understand the uses of each entity and its role in the corporate data management scheme.  This duty requires the data analyst to provide flexible yet solid definitions and usage of the logical entities and attributes found in all the organization’s data and file structures.  By advocating and participating in the planning and coordination of the information resource across related applications and business areas, the data analyst maximizes data sharing and minimizes design and data redundancy.  Data analysts are also concerned with the metadata of an object (definitions, standard names, formats, common code sets, etc.) and with keeping that metadata accessible and centrally stored.
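
As a loose illustration of the kind of centrally stored metadata the data analyst curates, the Python sketch below models a single hypothetical data-dictionary entry; the attribute name, format, and code set are invented for the example and do not come from this article.

```python
from dataclasses import dataclass, field

@dataclass
class AttributeDefinition:
    """One attribute entry in a hypothetical central data dictionary."""
    standard_name: str   # agreed enterprise-wide name, reused by every application
    definition: str      # business definition, not a technical one
    data_format: str     # e.g. "CHAR(2)", "DATE"
    common_codes: dict = field(default_factory=dict)  # shared code set, if any

# Example entry: a shared "state code" attribute that applications reuse rather than redefine.
state_code = AttributeDefinition(
    standard_name="CUSTOMER_STATE_CODE",
    definition="Two-letter postal abbreviation of the customer's mailing state.",
    data_format="CHAR(2)",
    common_codes={"NY": "New York", "NJ": "New Jersey", "CT": "Connecticut"},
)

print(state_code.standard_name, "-", state_code.definition)
```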

Perhaps more than any other discrete discipline within IS, Data Administration requires a concrete grasp of the real business the company is in, not just the technical aspects of interaction with a computer.  Database administrators and application developers/integrators are not required to possess this level of business understanding.

The database administrator, a role separate from that of the data analyst, is a person with special skills relating to the DBMS under their control.  This physical data management function requires intimate knowledge of the DBMS, the platform it operates on, and the performance and technical usage requirements of the application under construction or enhancement.  Proper database analysis, structure, and design can prevent poorly performing, high-maintenance databases and the creation of unsharable data.

Defining proper access to the database, providing appropriate storage parameters, and executing regular and robust maintenance routines (backup and recovery, performance monitoring, etc.) are all the responsibility of a database administrator.  These functions require technical expertise and tenacious problem solving.  They also require detailed training in the DBMS’s operations, acquired through courses and practice.  Database administrators are usually less concerned with the business content of the data under their control than are data analysts, but they must understand the expected usage to design and enhance optimally performing databases.  The enhancement of database structures (adding or deleting columns, renaming columns or tables, etc.) must be done judiciously and by a technician skilled in the nuances of the database.  For example, Oracle does not provide a facility for dropping and renaming columns, a point known to Oracle DBAs but not to many other IS professionals.
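
As a rough sketch of why such structural enhancements demand DBMS-specific skill, the Python example below performs the classic rebuild-and-rename workaround (create a replacement table, copy the data, drop the original, rename the replacement) against an in-memory SQLite database purely for illustration; the table and column names are invented, and the exact procedure in an Oracle environment would of course differ.

```python
import sqlite3

# Workaround a DBA might use when the DBMS offers no direct
# "drop column" or "rename column" statement.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

cur.execute("CREATE TABLE customer (id INTEGER, cust_nm TEXT, obsolete_flag TEXT)")
cur.execute("INSERT INTO customer VALUES (1, 'Acme Corp', 'Y')")

# 1. Create a replacement table with the desired structure
#    (cust_nm renamed to customer_name, obsolete_flag dropped).
cur.execute("CREATE TABLE customer_new (id INTEGER, customer_name TEXT)")

# 2. Copy the surviving data across.
cur.execute("INSERT INTO customer_new (id, customer_name) SELECT id, cust_nm FROM customer")

# 3. Drop the old table and rename the replacement into its place.
cur.execute("DROP TABLE customer")
cur.execute("ALTER TABLE customer_new RENAME TO customer")

print(cur.execute("SELECT * FROM customer").fetchall())  # [(1, 'Acme Corp')]
conn.close()
```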

Application developers/integrators are expected to design and code the applications that provide data to the databases and present that data to the users.  Application developers are usually trained in the languages and interfaces of their applications but are not usually concerned with the analysis of that data from a business perspective.  Since they work with data after the database has been structured, they frequently do not fully understand the need for normalized logical design.  This lack of understanding can result in incorrect normalization of data if application developers perform database design or enhancement, and improper normalization can cause a database to perform poorly and require users to re-enter rather than reuse data in the application.
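
As a hedged illustration of the reuse that proper normalization provides, the sketch below contrasts a denormalized order table, where the customer’s details must be keyed in again on every order, with a normalized customer/orders pair that stores those facts once; the schema and sample data are invented for the example.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Denormalized: customer details are repeated (re-entered) on every order row.
cur.execute("CREATE TABLE order_flat (order_id INTEGER, customer_name TEXT, "
            "customer_city TEXT, amount REAL)")
cur.executemany("INSERT INTO order_flat VALUES (?, ?, ?, ?)",
                [(1, "Acme Corp", "Boston", 100.0),
                 (2, "Acme Corp", "Boston", 250.0)])  # same customer facts keyed in twice

# Normalized: customer facts are stored once and reused by every order.
cur.execute("CREATE TABLE customer (customer_id INTEGER PRIMARY KEY, name TEXT, city TEXT)")
cur.execute("CREATE TABLE orders (order_id INTEGER PRIMARY KEY, "
            "customer_id INTEGER REFERENCES customer, amount REAL)")
cur.execute("INSERT INTO customer VALUES (1, 'Acme Corp', 'Boston')")
cur.executemany("INSERT INTO orders VALUES (?, ?, ?)", [(1, 1, 100.0), (2, 1, 250.0)])

# The shared customer data is joined back in, not re-entered.
for row in cur.execute("SELECT o.order_id, c.name, c.city, o.amount "
                       "FROM orders o JOIN customer c ON c.customer_id = o.customer_id"):
    print(row)
conn.close()
```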

Also, application developers concentrate on a single application at a time.  Frequently, they do not have the broad, enterprise perspective necessary for the reduction or elimination of redundancy that is essential if data is to be used as a corporate resource.  The development of many stovepipe applications in the past is a result of data structure design by application specialists who were not considering the broader implications of sharing data and reducing data redundancy.

Many organizations considering the combination of data management and application development cite the need for swifter implementation of databases and more rapid enhancements to existing databases.  Sensitivity to deadline pressure is a constant throughout all development projects.  Decentralizing data management (logical and/or physical) appears to offer some slight advantages in faster application development.  However, the actual exposure to poorly defined data, poorly structured databases, incorrect enhancement procedures, and unsharable data far outweighs the small savings in time resulting from application developers performing data management functions.  Industry studies have consistently shown high costs for redesign and re-enhancement when the logical and physical data management functions are not performed by data management specialists (data analysts and database administrators).  Effective project management practices suggest dividing labor into discrete tasks, each performed by a specialist in that area.  Employing this management practice in the data management portion of a system development or enhancement project will enable an organization to contain the costs of that project.  Simply stated, faster application development is not the objective.  The correct objective is the development and enhancement of high-quality, high-integrity applications as efficiently as possible.

Data is rapidly growing in stature as a recognized corporate resource.  A centralized approach to logical and physical data management will promote the development and use of integrated, sharable data throughout applications, preserve the quality of that data and serve the needs of the business more effectively.

 