SQL Server Performance

performance difference SQL versus stored procedure

Discussion in 'T-SQL Performance Tuning for Developers' started by Jaap, Apr 3, 2003.

  1. Jaap New Member

    We are running a website which allows car dealers to search for spare parts. So far we used an ASP script which builds a SQL query string on the fly, which is then executed by the SQL server.

    We decided to upgrade to ASP.NET and to use stored procedures.
    Surprisingly, the stored procedure runs much slower than when we executed the SQL directly.

    The stored procedure is as follows:
    ______________________________________________________________________________________
    CREATE PROCEDURE [dbo].[spPartsSearchEngine]
    @partsID int,
    @countryID tinyint,
    @districtID smallint,
    @notStatus smallint

    AS
    SET NOCOUNT ON

    SELECT parts.partsID, pts_dealers.dealerID, pts_dealers.dealerName, pts_dealers.remarks, pts_dealers.city, pts_dealers.countryAddress,
    parts.countryID, parts.numberInStock, parts.dateUpdated, parts.status
    FROM parts
    LEFT OUTER JOIN pts_dealers ON pts_dealers.dealerID = parts.dealerID
    WHERE partsID IN (SELECT Finiscode FROM parts_Renumberings
    WHERE Finiscode2 = @partsID UNION ALL SELECT @partsID)
    AND parts.numberInStock > 0
    AND parts.status <> @notStatus
    AND parts.countryID = @countryID
    AND parts.districtID <> @districtID
    ORDER BY status ASC, dateUpdated DESC, numberInStock DESC
    GO
    ______________________________________________________________

    The parts table contains about 700,000 records of five integers each, the parts_Renumberings table about 600,000 records of six integers each, and the pts_dealers table about 300 records.


    When I execute this same SQL from Query Analyzer it runs just fine. The query plan exactly matches that of the old ASP version (both versions are running on the same machine, and both databases are practically identical).

    Roughly, the execution plan looks as follows:
    - clustered index seek on Parts_renumberings
    - table scan on parts
    - index seek on pts_dealers
    - nested loop to join parts and pts_dealers
    - bookmark look-up on pts_dealers

    When we use the stored procedure, however, the execution time goes from about one hundredth of a second to about half a second.

    We tried updating statistics but to no avail.

    The only way we found we could get the stored procedure to perform well (and even better than before) is by adding an index to the parts table and including an index hint in the stored procedure: "WITH (INDEX = IX_parts_partsID NOLOCK)"

    The problem, however, is that adding the index on the parts table slows down the inserting and deleting of records enormously. As this table is constantly updated, this would lead to an unacceptable performance of the system.

    From Query Analyzer we learn that the stored procedure generates many more reads than when executing the query directly.

    Any help is appreciated,

    Jaap Kramer
  2. satya Moderator

    First of all, for slow-running queries, run Profiler and capture a trace, where you can see the set of events during this process. Then submit the trace to the Index Tuning Wizard, which recommends indexes to be applied.

    Also, make sure no other process is blocked on, or holding locks on, the same records.

    _________
    Satya SKJ
  3. gaurav_bindlish New Member

    Why is UNION ALL required in the query? Can you just use UNION instead, for a performance gain? Looking at the procedure, I think UNION should give the same results as UNION ALL.

    Gaurav
  4. bradmcgehee New Member

    Gaurav,

    Actually, it's the opposite. UNION by itself causes the combined data to be sorted and duplicates eliminated. UNION ALL says to combine all the data, even if there are duplicates. UNION ALL uses fewer resources, and is the preference if you know that there will be no duplicates when combining the data, or if you don't care whether there are any duplicates.
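    A quick standalone illustration of the difference, using literal values:

    ```sql
    -- UNION sorts the combined rows and eliminates duplicates (extra work):
    SELECT 1 AS n UNION SELECT 1 UNION SELECT 2
    -- returns 2 rows: 1, 2

    -- UNION ALL simply concatenates the results, duplicates and all:
    SELECT 1 AS n UNION ALL SELECT 1 UNION ALL SELECT 2
    -- returns 3 rows: 1, 1, 2
    ```

    In Jaap's IN (...) subquery a duplicate value makes no difference to the result, so UNION ALL is the cheaper choice there.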

    ------------------
    Brad M. McGehee
    Webmaster
    SQL-Server-Performance.Com
  5. trifunk New Member

    As a side note, if you're using ASP.NET you should make sure you're using the SqlClient classes instead of the OleDb classes, for performance reasons.
    Also, use the SqlDataReader, as it's the equivalent of a read-only, forward-only cursor (OK, it's not actually a real cursor) and it's faster than using DataTables and DataSets. Are you binding the data or just writing it out?

    Cheers
    Shaun

    World Domination Through Superior Software
  6. Argyle New Member

    quote:Originally posted by Jaap
    From the query analyser we learn that the stored procedure creates much more reads than when executing the query directly.
    How are you executing the SQL query and the stored procedure in QA when comparing them? I'll copy/paste what I wrote in a previous post:

    Note that the following query in QA

    ---------------------
    DECLARE @p1 INT
    DECLARE @p2 INT

    SELECT @p1 = 8
    SELECT @p2 = 5

    SELECT * FROM MyTable WHERE column1 = @p1 AND column2 = @p2
    ---------------------

    Is different from this one in QA

    ---------------------
    SELECT * FROM MyTable WHERE column1 = 8 AND column2 = 5
    ---------------------

    When you hard-code the values as in the second example, it is much easier for SQL Server to find the correct index. In the first example with @p1 and @p2, SQL Server has to rely on the index statistics. It's the same with an SP as I understand it, and in some cases dynamic SQL can be faster than a stored procedure. Also, if your data is skewed, your original SP execution plan that is generated and cached might not fit the next execution with different parameters. If that is the case you might have to create the SP with the WITH RECOMPILE option. That in itself will not create a faster SP, but it will prevent cached plans that are way off if your data is skewed.

    Try out different input parameters and see if you always get the same behaviour, with dynamic SQL being faster than the stored procedure. Also do some tests with the WITH RECOMPILE option.
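    For reference, a minimal sketch of two workarounds: the WITH RECOMPILE option mentioned above, plus the local-variable trick, which is a different technique that sidesteps parameter sniffing. Both use a cut-down version of Jaap's procedure (column list trimmed, procedure names made up for illustration):

    ```sql
    -- Sketch 1: WITH RECOMPILE builds a fresh plan on every call, so each
    -- parameter set is optimized on its own (at the cost of a compile per call).
    CREATE PROCEDURE dbo.spPartsSearchEngine_Recompile
        @partsID int
    WITH RECOMPILE
    AS
    SET NOCOUNT ON
    SELECT partsID, numberInStock, dateUpdated
    FROM parts
    WHERE partsID = @partsID AND numberInStock > 0
    GO

    -- Sketch 2: copying the parameter into a local variable keeps a single
    -- cached plan, but the optimizer builds it from overall index statistics
    -- instead of "sniffing" the first caller's value.
    CREATE PROCEDURE dbo.spPartsSearchEngine_Local
        @partsID int
    AS
    SET NOCOUNT ON
    DECLARE @localPartsID int
    SET @localPartsID = @partsID
    SELECT partsID, numberInStock, dateUpdated
    FROM parts
    WHERE partsID = @localPartsID AND numberInStock > 0
    GO
    ```

    The local-variable trick trades plan accuracy for plan stability; WITH RECOMPILE does the opposite.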

    /Argyle
