Monitor Your SQL Server’s Performance with SQL Power Tools Zero Impact Product Line

Monitoring a SQL Server-based application’s performance, whether it’s one you have purchased off-the-shelf or one written in-house, is a complex task involving many variables. But when it comes right down to it, what matters most is the user’s perception of performance. After all, our purpose in life (as DBAs) is to provide the best overall performance for our users. Let’s look at an example.

Debbie is a data-entry operator for an ERP application. She sits at her desktop all day, every day, entering accounts receivable data. She knows her job well and is a fast typist. In a typical day, she enters or updates over 100 invoices. Her company is growing fast and the number of invoices keeps increasing. Soon, she has to work overtime to keep up with the number of new invoices she has to enter, sometimes as many as 150 per day.

Her biggest frustration is that she could enter all 150 invoices in her regular 8-hour day if the ERP package weren’t so slow. Before the increase in work, waiting for screens to appear; for inserts, updates, and deletes to complete; and for reports to print wasn’t a problem. But now that she is busier, and the waits are getting longer, she is frustrated that she has to sit around doing nothing while she waits for the ERP system to catch up. If the system were faster, she could get home earlier each night, saving the company overtime.

By the time you add up the cost of wasted time for Debbie, and perhaps dozens, if not hundreds, of other users of the ERP system, a slow ERP system becomes very expensive.

Often, when a new SQL Server-based application is first installed, whether it’s an ERP application or otherwise, performance is acceptable. But slowly, as data is added to the database and more users begin using it, the overall application begins to slow down. This may take a month, six months, or more than a year, but it eventually slows down.

Don’t blame SQL Server alone for the problem. Performance problems are often complex and stem from a wide variety of causes: poor application design, bad query design, unexpectedly heavy usage, poor network performance, or one of many dozens of other reasons. But whatever is causing the problem, it needs to be identified and corrected, and the sooner the better.

Most users won’t notice small delays that accumulate over time. But at some point, they will realize that their application is running slower than it used to. And even when they do notice the degradation in performance, they may not tell anyone until it is too late to do much about it.

Wouldn’t it be better if we had a system that could monitor our SQL Server-based applications, look for poor end-user response times, and then let us know they are occurring long before they become a major problem? In addition, wouldn’t it be great if we could use the monitoring data we have collected to help analyze the cause of the performance problems and resolve them? And last of all, wouldn’t it be ideal if our monitoring system didn’t itself impede our application’s performance?

While SQL Server does include some performance monitoring tools, they aren’t comprehensive or always easy to use. To help fill this gap, Sql Power Tools has developed a suite of tools to help DBAs better monitor and analyze SQL Server performance problems.

The Sql Power Tools Suite

Sql Power Tools offers a suite of tools to help monitor SQL Server. They include:

  • Zero Impact Monitor (base product)
  • Zero Impact Service Level Monitor
  • Zero Impact Top N SQL Analyzer
  • Stealth Blocked Process Monitor
  • eData Usage Analyzer
  • Sql Loopback Monitor (currently available for UNIX environments, but will be available for Windows in 2004)

These tools work together to help DBAs gather and analyze the SQL Server data transmitted between users and SQL Server, whether the environment is two-tier or n-tier. Here’s a look at each tool, what it does, and how it works.

Zero Impact Sql Monitor

The Zero Impact Monitor is a software agent that listens to traffic between end users and SQL Server. It captures, monitors, and analyzes the performance of all SQL transactions in real time, allowing you to measure “end-user response” time, “database server response” time, and “client network receive” time.

End-user response time is the sum of 1) the network time it takes for a request to travel from the client to SQL Server, 2) the database server response time, and 3) the network time it takes for the response to travel from SQL Server back to the client. Database server response time is the difference between the time the last request packet is received by SQL Server from the client and the time the first result packet is sent back to the client. Client network receive time is the difference between the time the first result packet is received by the client and the time the last result packet is received by the client.
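To make these three definitions concrete, here is a minimal sketch of how the metrics relate. The timestamp names and values below are assumptions for illustration only, not the product’s actual data model:

```python
# Hypothetical illustration of the three response-time metrics described
# above. All timestamps are in seconds; the parameter names are assumed
# for the example, not taken from the product.

def response_times(request_sent, last_request_pkt_received,
                   first_result_pkt_sent, first_result_pkt_received,
                   last_result_pkt_received):
    # Database server response time: gap between the last request packet
    # SQL Server receives and the first result packet it sends back.
    db_server = first_result_pkt_sent - last_request_pkt_received
    # Client network receive time: gap between the first and last result
    # packets arriving at the client.
    client_receive = last_result_pkt_received - first_result_pkt_received
    # End-user response time: everything the user waits for, from sending
    # the request to receiving the final result packet.
    end_user = last_result_pkt_received - request_sent
    return end_user, db_server, client_receive

end_user, db_server, client_receive = response_times(
    request_sent=0.000,
    last_request_pkt_received=0.020,   # request fully arrives at SQL Server
    first_result_pkt_sent=0.270,       # server finishes work, starts replying
    first_result_pkt_received=0.290,   # first result packet at the client
    last_result_pkt_received=0.410,    # last result packet at the client
)
print(round(end_user, 2), round(db_server, 2), round(client_receive, 2))
```

In this example the user waits 0.41 seconds overall, of which 0.25 seconds is the server doing work and the rest is network transit time in both directions.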

Many other monitoring products, such as SQL Server’s Profiler, consume vital resources on the SQL Server being monitored. The Zero Impact SQL Monitor, just as its name implies, has zero impact on the SQL Servers being monitored. This is because the agent software is not installed on the SQL Servers themselves, but on other devices in your network, such as a Windows 95/98 desktop, an XP workstation, or a Windows 2003 Server box. If you run in a switched environment, the agent can take advantage of port mirroring to monitor your SQL Servers.

Here are some of the key features of Zero Impact SQL Monitor:

  • You can capture 100% of Transact-SQL code (none of the SQL is truncated) and performance data in real time. If desired, this data can be archived to a SQL Server database, or exported to other software for additional analysis.
  • The real-time monitoring data includes: end user response time, database server response time, and network time. In addition, you can view the Transact-SQL code that is currently being executed by the server. Data can also be grouped for easier analysis, such as by server, by application, by end-user, or even by specific Transact-SQL code.
  • Data can be filtered, included, excluded, or reconstructed, as needed, to make your analysis easier.
  • Besides collecting this data, real-time or off-line alerts can be created based on the SQL statements being sent between the client and server. This allows you to identify potential problems more quickly and to troubleshoot known issues.
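As an illustration of the kind of grouping described above (by server, by application, or by end user), here is a small sketch. The record layout and field names are assumptions for the example, not the product’s actual export format:

```python
# Hypothetical sketch: average end-user response time grouped by a chosen
# key, as a monitor might summarize captured timing data.
from collections import defaultdict

captured = [
    {"server": "SQL1", "app": "ERP", "user": "debbie", "end_user_ms": 410},
    {"server": "SQL1", "app": "ERP", "user": "debbie", "end_user_ms": 380},
    {"server": "SQL1", "app": "CRM", "user": "tom",    "end_user_ms": 95},
]

def group_avg(rows, key):
    """Average end_user_ms per distinct value of `key`."""
    totals = defaultdict(lambda: [0, 0])  # value -> [sum, count]
    for row in rows:
        entry = totals[row[key]]
        entry[0] += row["end_user_ms"]
        entry[1] += 1
    return {value: total / count for value, (total, count) in totals.items()}

print(group_avg(captured, "app"))   # averages per application
print(group_avg(captured, "user"))  # averages per end user
```

Grouping the same captured records by different keys is what lets a DBA tell, for instance, whether one application or one user’s workload accounts for the slow response times.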

But the Zero Impact SQL Monitor is only the base component of the suite; the remaining tools are covered in the sections that follow.
