DATABASE DEVELOPMENT

Name

Course

Institution

Date

The improvement of data quality relies heavily on maintenance: on the maintenance plans and maintenance activities that are used. Maintenance in database development focuses mainly on implementing the remaining requirements, including those that arise from post-deployment needs.

Maintenance is an important phase for a running system, and in a normal project it is considered a time for system evolution. The choice of a maintenance plan should be guided by a number of factors, such as the type of database being developed and the features the plan offers. A project may therefore choose a plan that is holistic and conclusive, for instance the SCOM database maintenance plan, the shrink database maintenance plan, or the SharePoint database maintenance plan.

System Center Operations Manager (SCOM) uses a single interface that shows the state, health, and performance of a computer system. When a security situation is identified in the system's performance or configuration, it generates alerts about the threats it has detected.
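To make this monitor-and-alert flow concrete, the following is a minimal Python sketch of threshold-based alerting; the metric names, thresholds, and Alert structure are illustrative assumptions, not SCOM's actual API.

    from dataclasses import dataclass

    @dataclass
    class Alert:
        source: str        # monitored system that raised the alert
        metric: str        # e.g. "cpu_percent" or "failed_logins"
        value: float       # observed value
        threshold: float   # configured limit that was exceeded

    # Hypothetical per-metric limits an operator might configure.
    THRESHOLDS = {"cpu_percent": 90.0, "failed_logins": 5.0}

    def evaluate(source, sample):
        """Compare one health/performance sample against the thresholds."""
        return [Alert(source, m, v, THRESHOLDS[m])
                for m, v in sample.items()
                if m in THRESHOLDS and v > THRESHOLDS[m]]

    for alert in evaluate("web-01", {"cpu_percent": 97.2, "failed_logins": 2}):
        print(f"ALERT {alert.source}: {alert.metric}={alert.value} "
              f"exceeds threshold {alert.threshold}")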

The shrink database maintenance plan takes care of large volumes of data. The expansion of content, the increase in application requests, and the daily addition of users to the system create more sources of data, so more data is generated. Data with higher granularity requires analytical tools that can work through vast amounts of it. The shrink database maintenance plan helps with such challenges by reclaiming unused space, improving performance on load times and data retention.
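As a hedged sketch, the shrink step such a plan performs corresponds to SQL Server's DBCC SHRINKDATABASE command, which could be issued from Python via pyodbc roughly as follows; the connection string, the SalesDb database name, and the 10% free-space target are assumptions.

    import pyodbc

    conn = pyodbc.connect(
        "DRIVER={ODBC Driver 17 for SQL Server};"
        "SERVER=localhost;DATABASE=master;Trusted_Connection=yes;",
        autocommit=True,  # DBCC maintenance commands should not run in a transaction
    )
    # Shrink the hypothetical SalesDb, leaving 10% free space in its files.
    conn.cursor().execute("DBCC SHRINKDATABASE (SalesDb, 10);")
    conn.close()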

SharePoint, developed by Microsoft, is a multipurpose web-based set of technologies built on a common technical infrastructure that integrates intranet management with content and document management. Its system and process integration gives it workflow capabilities.

There are activities that can be performed to increase the quality of data. These include data quality assurance, a data profiling process that discovers inconsistencies and anomalies in the data while performing data cleansing activities such as interpolating missing data and removing outliers. Data quality control governs the usage of data whose quality has been measured, whether in a process or an application. It follows the discovery of inconsistencies in data and their correction, and it reports the severity of each inconsistency, the data's incompleteness, its accuracy, its precision, and any missing or unknown values. To keep data quality at its optimum, data quality checks are run; these include the completeness and precision check, the validity check, the accuracy check, the timeliness check, the consistency check, the reasonableness check, and the conformity and integrity check.
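As a minimal sketch of several of these activities in Python with pandas, the following profiles completeness, applies a validity check, removes out-of-range readings as a reasonableness check, and interpolates the missing values; the column names, status codes, and the [0, 20] plausible range are illustrative assumptions.

    import pandas as pd

    df = pd.DataFrame({
        "reading": [10.2, 10.4, None, 10.1, 55.0, 10.3],   # None = missing value
        "status":  ["ok", "ok", "ok", "bad", "ok", "???"], # "???" = invalid code
    })

    # Completeness check: fraction of missing values per column.
    print(df.isna().mean())

    # Validity check: only known status codes are acceptable.
    invalid = ~df["status"].isin(["ok", "bad"])
    print(f"{invalid.sum()} invalid status value(s)")

    # Reasonableness check: blank readings outside the assumed [0, 20] range...
    df.loc[~df["reading"].between(0, 20), "reading"] = None

    # ...then cleanse by interpolating the missing values.
    df["reading"] = df["reading"].interpolate()
    print(df)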

Integrated methodologies are efficient for planning proactive concurrency control methods and lock granularities because they integrate the seminal methodologies and take a process-centered approach that targets many different and distinct applications. They achieve this through a fair attempt at manageability that ensures customizability, and they have much to offer in process components, patterns, and measurement and management (Ramsin & Paige, 2008).
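To illustrate what a choice of lock granularity means for proactive (pessimistic) concurrency control, here is a minimal Python sketch contrasting one coarse table-level lock with fine-grained record-level locks; the Table class and its record layout are illustrative assumptions.

    import threading

    class Table:
        def __init__(self, rows):
            self.rows = dict(rows)
            self.table_lock = threading.Lock()                    # coarse granularity
            self.row_locks = {k: threading.Lock() for k in rows}  # fine granularity

        def update_coarse(self, key, delta):
            # One table-level lock serializes ALL writers,
            # even those touching different records.
            with self.table_lock:
                self.rows[key] += delta

        def update_fine(self, key, delta):
            # Record-level locks let writers on different records proceed
            # concurrently; only writers contending for the same record block.
            with self.row_locks[key]:
                self.rows[key] += delta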

The verify method can be used to plan out the system effectively while ensuring that the number of transactions does not produce record-level locking while the database is in operation. The method verifies the integrity of all the databases in the file named by its file parameter and can optionally output the databases' key/data pairs to the file stream named by its outfile parameter.
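This description matches the DB->verify method of Berkeley DB; assuming that context, a minimal Python sketch using the bsddb3 bindings might look like the following, where mydata.db and salvage.txt are placeholder file names.

    from bsddb3 import db

    d = db.DB()
    try:
        # Check the integrity of every database in the file; with DB_SALVAGE,
        # the key/data pairs are also written to the outfile stream.
        d.verify("mydata.db", outfile="salvage.txt", flags=db.DB_SALVAGE)
        print("verification passed")
    except db.DBVerifyBadError as exc:
        print(f"verification failed: {exc}")
    # Berkeley DB destroys the handle during verify, so d is not reused here.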

References

Ramsin, R., & Paige, R. F. (2008). Process-centered review of object oriented software development methodologies. ACM Computing Surveys, 40(1).