Poor performing sprocs and order import problems | SQL Server Performance Forums

SQL Server Performance Forum – Threads Archive

Poor performing sprocs and order import problems

SQL Server novice in need of some general advice!

I am currently responsible for the maintenance of a fairly complex ASP/SQL ordering system that is performing progressively worse as the database becomes more populated and the levels of use increase.

Many of the pages perform poorly anyway, but the whole system grinds to a halt when an order import is run.

XML documents are periodically processed by a COM object, which then seems to call an order-creation stored procedure for each order. Checks must be made against several tables to assess the type of each order, and then different tables must be updated and different calculations made depending on the order type. I think the order-creation sproc is using a T-SQL cursor.

- Would a better order-import solution be to import all order data into a temporary (or intermediate) table, use stored procedures to run SELECT queries that categorise the orders, and then use INSERTs and UPDATEs within an explicit transaction to create the order records? (A rough sketch of what I mean follows below.)

- Or would it be better to do the checking in a client program (a COM or .NET object), using recordsets to capture the information for the initial checks? An ADO cursor could then move through the list of orders and, once the orders have been categorised, send a number of INSERT and UPDATE query batches to the SQL Server database within a transaction.

Is either of these strategies preferable, or is there a better approach?

Any insights would be greatly appreciated! :)
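Roughly what I had in mind for the first option, as a set-based alternative to the cursor (all table and column names below are made up purely for illustration):

    -- Categorise a whole batch of imported orders in one pass against the lookup tables
    UPDATE i
    SET    OrderType = CASE
                           WHEN c.IsTradeAccount = 1 THEN 'TRADE'
                           WHEN p.RequiresApproval = 1 THEN 'APPROVAL'
                           ELSE 'STANDARD'
                       END
    FROM   dbo.OrderImport i
           INNER JOIN dbo.Customers c ON c.CustomerID = i.CustomerID
           INNER JOIN dbo.Products  p ON p.ProductID  = i.ProductID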
What errors are generated during this process?
What are the service pack levels of SQL Server and the OS?
Is there any information in Event Viewer about this behaviour? Satya SKJ
Moderator
http://www.SQL-Server-Performance.Com/forum
This posting is provided “AS IS” with no rights for the sake of knowledge sharing.
Generally speaking, I'd go with an import table and calling stored procedures to process the data on SQL Server. Two suggestions towards better speed:

* Use IF EXISTS (SELECT ...) to avoid unnecessary UPDATEs, and perhaps also IF NOT EXISTS (SELECT ...) for INSERTs. UPDATE is a time-consuming operation, even if no records are affected (the same goes for DELETE).

* Do the UPDATE before you do the INSERT: this way the UPDATE runs against a smaller target table, and is probably faster.

That last suggestion would be inappropriate if your UPDATE batch contains records that are also in your INSERT batch, because they would then only get INSERTed, and not UPDATEd. So this depends largely on the flow in the client application.
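A rough sketch of that pattern, assuming a staging table called OrderImport and a target table called Orders (both names, and the columns, are hypothetical):

    BEGIN TRANSACTION

    -- UPDATE first, while the target table is still at its smaller size,
    -- and only touch rows that already exist in the target
    UPDATE o
    SET    o.OrderType = i.OrderType,
           o.Quantity  = i.Quantity
    FROM   dbo.Orders o
           INNER JOIN dbo.OrderImport i ON i.OrderNumber = o.OrderNumber

    -- Then INSERT only the rows that are not already there
    INSERT INTO dbo.Orders (OrderNumber, OrderType, Quantity)
    SELECT i.OrderNumber, i.OrderType, i.Quantity
    FROM   dbo.OrderImport i
    WHERE  NOT EXISTS (SELECT 1 FROM dbo.Orders o
                       WHERE o.OrderNumber = i.OrderNumber)

    COMMIT TRANSACTION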
Thanks for your responses! The error message below appears after a delay on the more processing-intensive ASP pages that people try to access while the update is running:
Microsoft OLE DB Provider for ODBC Drivers (0x80040E31)
[Microsoft][ODBC SQL Server Driver]Timeout expired

The OS is Win2K (Service Pack 3) with SQL Server 2000 (Service Pack 3). There don't seem to be any messages logged in Event Viewer as a result of this problem. I think the import is simply tying up the processor while it is running (and possibly causing record-locking problems?). Task Manager on the database server often shows the processor running at 100% while the import is being processed. The COM object runs on the web server (which doesn't show much processor load), so it seems likely that it is the stored procedure code, and the way the sprocs are being used, that is causing the problems. For this reason I was thinking of trying to take more of the processing away from SQL Server, but I think it is very likely that a more efficient system of stored procedures could be created.
Try the following:
1. Run a small .ASP page to collect the data.
2. Create a .udl to check the SQL Server connection.
3. Check the ASP connection string. Check this KBA http://support.microsoft.com/defaul…port/kb/articles/q175/2/64.asp&NoWebContent=1 for any relevance. Satya SKJ
Moderator
http://www.SQL-Server-Performance.Com/forum
This posting is provided “AS IS” with no rights for the sake of knowledge sharing.
It sounds like you might want to turn on Profiler and look for RPC:Completed and SQL:BatchCompleted events taking more than, say, 100 ms in duration or CPU time. This should give you some clue as to which procs to target for performance tuning.

In terms of architecture: if the COM object only serves to read the XML and pass it to the database, then you could consider passing the XML directly to a stored procedure and letting SQL Server process it. Check out OPENXML in Books Online.

Cheers
Twan
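A minimal sketch of the OPENXML route Twan describes, assuming the order batch is passed in as a single parameter (the procedure name, XML layout, and table and column names are all made up for illustration):

    -- Hypothetical: pass the whole order batch in one call instead of one call per order
    CREATE PROCEDURE dbo.usp_ImportOrders
        @OrderXml ntext
    AS
    BEGIN
        DECLARE @hDoc int

        EXEC sp_xml_preparedocument @hDoc OUTPUT, @OrderXml

        -- Shred the XML into the staging table in a single set-based INSERT
        INSERT INTO dbo.OrderImport (OrderNumber, CustomerID, ProductID, Quantity)
        SELECT OrderNumber, CustomerID, ProductID, Quantity
        FROM   OPENXML(@hDoc, '/Orders/Order', 2)
               WITH (OrderNumber int, CustomerID int, ProductID int, Quantity int)

        EXEC sp_xml_removedocument @hDoc
    END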
I suspect this is a performance issue where the order record generation is taking a lot of time. I would suggest looking at the steps carried out during the order generation and checking whether the underlying tables have appropriate indexes. Debugging the stored procedure to check for any obvious bottlenecks may also be helpful. As suggested previously in the forum, having separate tables for reports may also be considered. Gaurav
Moderator
Man thrives, oddly enough, only in the presence of a challenging environment- L. Ron Hubbard
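As an illustration of the index check Gaurav mentions, indexes along these lines (names and columns are hypothetical) would support the lookups the import makes for each order:

    -- Hypothetical: index the columns the order-creation procedure searches on
    CREATE NONCLUSTERED INDEX IX_Orders_OrderNumber
        ON dbo.Orders (OrderNumber)

    CREATE NONCLUSTERED INDEX IX_OrderImport_Customer_Product
        ON dbo.OrderImport (CustomerID, ProductID)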

The views expressed here are those of the author and no one else. There are no warranties as to the reliability or accuracy of anything presented here.