Implementing SQL Server in an OLTP Environment

Business application development is an increasingly complex process. Information technology managers and developers working for an organization that requires a business application must evaluate and implement appropriate application modeling strategies, software, database technologies and solution models.

During the application development process, technologies, methodologies and development practices can change drastically. The goal of the organization is to develop an application that supports current business processes while adopting technologies and methodologies that will remain forward compatible with new developments in the information technology field. Typically, one of the first applications an organization develops is one that facilitates the transactional nature of its business. A transactional application often serves as a platform to accept business information from customers or clients, process that information and deliver a predefined product or service. An Online Transaction Processing (OLTP) application environment facilitates the development and operation of such an application.

OLTP Environment Characteristics

The OLTP environment consists of several layers. The bottom-most layer is the data layer, or data tier, which comprises all of the data the organization needs to complete business transactions. The data tier can include legacy data housed on mainframes, relational data housed in SQL Server or Oracle, non-relational data such as Exchange Server data, and other data supplied by accounting packages and other applications.

The business logic tier consists of the elements that apply business rules and logic to the data tier, mapping the organization’s rules and procedures onto the transactional nature of the application. This tier also includes the technologies that facilitate the transactional nature of the organization’s business, such as Exchange Server, Internet Information Server and other application services.

The presentation tier consists of the ‘thin’ client portion of the transactional application. The presentation and business object tiers, formerly deployed together at the client and later separated, now exist as a ‘window’ onto the actual business and data tiers. The presentation layer provides the client with a formatted view of the functionality of the business and data layers.

The client, which exists only as a mechanism to facilitate the user’s interaction with the application, is typically a small or ‘thin’ deployment that can be implemented across a wide variety of platforms, including the Internet. Technologies such as Active Server Pages (ASP), VBScript, JavaScript and XML are designed to allow for the implementation of a thin client with rich functionality.

SQL Server 7.0 and the Data Tier

SQL Server 7.0 is an integral part of the data tier. It is designed to work effectively in a number of application implementation strategies and solutions. By nature, SQL Server is a multitier client/server database system. In past implementations of OLTP models, SQL Server participated in a two-tier client/server environment: the client machine ran an executable-based thick client application that connected to SQL Server and conducted business transactions. The thick client housed both the business logic and the code to display output to the user.

In a multitier client/server environment, SQL Server 7.0 assumes a more active role in the OLTP process. Using several new features and technologies, SQL Server 7.0 can assist in the deployment of a distributed OLTP application and a thin client. In a multitier environment, the business logic is located on a separate server, and the thin client is responsible only for displaying output to the user and accepting input. Through features such as application roles, Windows NT authentication, and database roles and permissions, SQL Server assists in the deployment of a thin client application across a multitude of platforms, including distributed Internet applications. Middle-tier or business logic tools such as Internet Information Server 4.0/5.0 (IIS) and Microsoft Transaction Server (MTS) fully integrate with SQL Server 7.0, which further assists in the creation of a thin client. By keeping the client as thin as possible, developers can adapt OLTP needs to new and emerging technologies.

For example, an organization’s developers can migrate a client application that communicates with MTS from a rich Visual Basic client to a thin Visual Studio client built on web forms that communicates with MTS and SQL Server via IIS in an Internet or intranet distributed environment.
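As a rough illustration of how the security features mentioned above keep permissions out of the client, the following T-SQL sketch uses a hypothetical OrderEntryApp application role, password and Orders table; the middle-tier component would activate the role on its connection after authenticating.

  -- A sketch using hypothetical names. The application role carries the
  -- permissions; individual logins need no direct rights on the tables.
  EXEC sp_addapprole 'OrderEntryApp', 'AppRolePassword'

  -- Grant only the statements the OLTP client requires.
  GRANT SELECT, INSERT, UPDATE ON Orders TO OrderEntryApp

  -- At run time, the middle-tier component activates the role on its connection.
  EXEC sp_setapprole 'OrderEntryApp', 'AppRolePassword'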

Determining the OLTP Roadmap

Determining the implementation of the multitier model involves the recognition and evaluation of several factors. First, business management, in conjunction with the information technology department, must develop a vision and scope for the application. By the time the technical evaluation of a multitier OLTP implementation begins, the scope of how the OLTP application will satisfy business needs should already be complete and supported by appropriate documentation. In addition, business rule discovery and documentation should be finished and well understood by all parties involved in the development of the application. Once the business rules have been finalized, OLTP strategy evaluations can begin.

Determine the Presentation Layer

The choice of client type will serve as the foundation for the remainder of the OLTP model planning process. A thin client is preferable to a thick client because of its ease of design, distribution and management.

Determining the client type involves selecting the platform on which the client will be built. Thin clients can be constructed using Active Server Pages in conjunction with VBScript or Java and distributed in an Internet or intranet environment. Clients can also be Visual Basic executables, Visual Studio clients or clients built with other technologies such as Cold Fusion.

The type of thin client is usually a function of the core skill set of the organization’s developers. An organization’s management may decide to outsource client creation in favor of desired functionality (for example, an Internet-distributed application as opposed to a Visual Basic executable).

Determine the Middle Tier

The technologies involved in the middle tier are partially determined by the type of client that the organization plans to deploy. The complexity of the business rules and logic, in conjunction with other factors, also contributes to the evaluation and selection of middle-tier technologies. Factors such as the size of the user community, network traffic, and client management and administration requirements all help determine which technologies will be deployed as part of the middle tier.

Technologies

  • Microsoft Transaction Server

      • Client load balancing, including distribution and management of multiple client transactions

      • Management of business objects and rules

      • Direct communication with the SQL Server 7.0 data tier

  • Microsoft Internet Information Server

      • Web application services, including secured authentication of clients (users)

      • Hosting of dynamic web content, including Active Server applications and other Internet-distributed applications

      • Support of transactional web applications

Determine the Data Tier

The data platform chosen by the organization will be a function of the business’s existing database technologies, the capability of the database to integrate and merge heterogeneous data from various sources, and the availability of personnel to assist in developing and maintaining the RDBMS.

SQL Server 7.0’s features lower the total cost of ownership, reduce the time required for traditional DBA activities and support the consolidation and presentation of data from relational and non-relational sources. With features such as Data Transformation Services (DTS) and OLE DB, as well as its enhanced and redesigned transaction management system, SQL Server 7.0 is often the optimal choice for an organization’s OLTP needs.

SQL Server Technologies

  • Data Transformation Services (DTS) for the extraction, transformation and loading (ETL) of relational and non-relational data.

  • Reduction of administrative tasks through increased dynamic resource allocation, self-management and enhanced transaction management.

  • Support for a wide variety of APIs including ADO, OLE DB, DAO, ESQL, T-SQL and XML.

  • Ease of integration with the Microsoft BackOffice family and Windows NT authentication (see the sketch following this list).
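The last item above, Windows NT authentication, can be sketched roughly as follows; the NTDOMAIN\OrderClerks group and OrderEntry database are hypothetical names used only for illustration.

  -- Grant a Windows NT group a SQL Server login, access to the OLTP
  -- database and membership in a fixed database role.
  EXEC sp_grantlogin 'NTDOMAIN\OrderClerks'

  USE OrderEntry
  EXEC sp_grantdbaccess 'NTDOMAIN\OrderClerks'
  EXEC sp_addrolemember 'db_datareader', 'NTDOMAIN\OrderClerks'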

Design the Data Model

High volumes of users or clients requiring concurrent connectivity characterize the Online Transaction Processing environment. Each client can conceivably initiate several data manipulation language (DML) statements that SQL Server accepts, logs, processes and executes. As a result, a predefined data model must exist prior to physical model creation.

This model, crafted by database developers and administrators, should directly support the multitude of small insert, delete and update statements typical of an OLTP application. When designing the logical data model, designers must account for user concurrency, atomicity, speed of information retrieval and the speed with which records can be updated or written. Characteristics of an OLTP data model include a high degree of normalization; the normalization of data into relational tables optimizes the speed of transactions performed by the OLTP application.
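For illustration, the sketches in the remainder of this section use a hypothetical order-entry model consisting of Customers, Orders and OrderDetails tables; the names and columns are assumptions, not part of any particular application.

  -- A normalized OLTP structure: each table carries a primary key, and
  -- foreign keys relate detail rows to orders and orders to customers.
  CREATE TABLE Customers (
    CustomerID   int IDENTITY(1,1) NOT NULL PRIMARY KEY,
    CustomerName varchar(50)       NOT NULL
  )

  CREATE TABLE Orders (
    OrderID    int IDENTITY(1,1) NOT NULL PRIMARY KEY,
    CustomerID int      NOT NULL REFERENCES Customers (CustomerID),
    OrderDate  datetime NOT NULL
  )

  CREATE TABLE OrderDetails (
    OrderID   int NOT NULL REFERENCES Orders (OrderID),
    ProductID int NOT NULL,
    Quantity  int NOT NULL,
    CONSTRAINT PK_OrderDetails PRIMARY KEY (OrderID, ProductID)
  )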

Normalization

Normalization of tables optimizes the efficiency of the transactions occurring in an OLTP environment. Using formal methods to separate data into multiple related tables allows client activity in the database to be accommodated more efficiently during production.

A high degree of normalization is conducive to the multitude of DML statements and reads in a transactional environment, and this improvement in performance increases transactional consistency and efficiency.
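Against the hypothetical tables sketched above, a typical OLTP transaction is short and atomic, touching only a handful of rows, for example:

  -- Insert an order header and one detail row; both succeed or neither does.
  -- The customer (42) and product (77) values are assumed to already exist.
  BEGIN TRANSACTION

  INSERT INTO Orders (CustomerID, OrderDate)
  VALUES (42, GETDATE())

  INSERT INTO OrderDetails (OrderID, ProductID, Quantity)
  VALUES (@@IDENTITY, 77, 3)

  COMMIT TRANSACTION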

As the degree of normalization increases, the complexity of the data model invariably increases as well. The number of relationships and constraints that must be applied to the model to maintain referential integrity can become equally intensive and difficult to administer. Additionally, the many complex joins between tables will begin to hinder performance. The degree of normalization should therefore reflect a balance between normalization form methodology and performance considerations.
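The trade-off can be seen in the hypothetical model sketched earlier: retrieving a single order for display already requires a three-table join rather than a read of one flat record.

  -- Reassembling one order from the normalized tables.
  SELECT c.CustomerName, o.OrderID, o.OrderDate, d.ProductID, d.Quantity
  FROM Customers c
    JOIN Orders o ON o.CustomerID = c.CustomerID
    JOIN OrderDetails d ON d.OrderID = o.OrderID
  WHERE o.OrderID = 1001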

Data Integrity

OLTP database design should closely follow the rules and forms surrounding data integrity. When designing the OLTP model, Entity Integrity, Domain Integrity and Referential Integrity should all be enforced. Through the use of check constraints, primary keys, column default values and relationship constraints, the model actively preserves data integrity.
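Continuing with the hypothetical order-entry tables, declarative integrity can be sketched with a column default and a check constraint; the constraint names are arbitrary.

  -- Supply the order date automatically and enforce a domain rule on quantity.
  ALTER TABLE Orders
    ADD CONSTRAINT DF_Orders_OrderDate DEFAULT (GETDATE()) FOR OrderDate

  ALTER TABLE OrderDetails
    ADD CONSTRAINT CK_OrderDetails_Quantity CHECK (Quantity > 0)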

Because the data model maintains business rules about the integrity of the data, logic involving appropriate column values, relationships and data validation no longer needs to be built into the client. This approach allows the client to be slimmed down into a purely presentational device.

Procedural integrity also allows for a thin client by helping to maintain data integrity through stored procedures, triggers and check constraints. Using these SQL Server programmatic methods to enforce business rules at the data model allows the client to exclude the programming required to handle these rules.
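A minimal sketch of this approach, again using the hypothetical tables: the client calls a stored procedure rather than issuing raw DML, and the procedure applies an assumed business rule before inserting.

  -- Validate the quantity rule centrally, then perform the insert.
  CREATE PROCEDURE AddOrderDetail
    @OrderID   int,
    @ProductID int,
    @Quantity  int
  AS
    IF @Quantity <= 0
    BEGIN
      RAISERROR ('Quantity must be greater than zero.', 16, 1)
      RETURN
    END

    INSERT INTO OrderDetails (OrderID, ProductID, Quantity)
    VALUES (@OrderID, @ProductID, @Quantity)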

SQL Server default and rule objects are not ANSI standard and should be considered only as a mechanism for backward compatibility with SQL Server 6.5. Column defaults and check constraints should be used instead of these legacy database objects.

OLTP Reporting

As an organization collects data, an inherent need will arise to analyze the transactional data and supply it to business users and analysts. However, because of the characteristics of an OLTP environment (multiple users concurrently inserting, modifying and deleting records), reporting becomes a resource-intensive process.

Reporting in an OLTP environment adds additional ‘traffic’ to the data model. In addition to the typical DML activities, the database must now support the selection and reading of data for reporting purposes. Because of the degree of normalization present in many OLTP models, the multiple joins and tables hinder read performance and increase the resources required for report generation. At the same time, the read operations required by reports obtain locks on records, utilize indexes and perform joins to gather the required information. These operations affect the speed and efficiency of transactional operations.

Online Analytical Processing (OLAP) reporting in the context of a data warehouse effectively alleviates the strain that reporting requirements place on an OLTP schema. OLAP or data warehouse solutions also increase read efficiency for information consumers by gathering and transforming OLTP data into a star schema. The star schema is optimized for read efficiency and can be populated nightly.
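As a rough sketch of the idea, the hypothetical OLTP tables used earlier could feed a simple star schema consisting of a fact table and a customer dimension, populated by a nightly job or DTS package; all names here are assumptions.

  -- A central fact table surrounded by denormalized dimension tables.
  CREATE TABLE DimCustomer (
    CustomerKey  int         NOT NULL PRIMARY KEY,
    CustomerName varchar(50) NOT NULL
  )

  CREATE TABLE FactSales (
    OrderDate   datetime NOT NULL,
    CustomerKey int      NOT NULL REFERENCES DimCustomer (CustomerKey),
    ProductID   int      NOT NULL,
    Quantity    int      NOT NULL
  )

  -- Nightly load: append the previous day's orders from the OLTP tables.
  INSERT INTO FactSales (OrderDate, CustomerKey, ProductID, Quantity)
  SELECT o.OrderDate, o.CustomerID, d.ProductID, d.Quantity
  FROM Orders o
    JOIN OrderDetails d ON d.OrderID = o.OrderID
  WHERE o.OrderDate >= DATEADD(day, -1, GETDATE())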

Continues…
