SQL Server Performance

Tuning Large table to get count

Discussion in 'SQL Server 2005 T-SQL Performance Tuning' started by mvayner, Mar 25, 2008.

  1. mvayner New Member

    Hello,
    I am working on a very large table with over 200 million rows.
    The table is actually very simple and cannot be normalized further.
    CREATE TABLE [dbo].[Company](
        [ID] [bigint] IDENTITY(1,1) NOT NULL,
        [CompanyName] [nvarchar](255) NOT NULL,
        CONSTRAINT [PK_Company] PRIMARY KEY CLUSTERED ([ID] ASC)
            WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF,
                  ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
    ) ON [PRIMARY]
    What I am trying to do is get a count of records based on criteria.
    So for example:
    SELECT count(ID) FROM dbo.Company WHERE CompanyName LIKE 'a%'
    Yes, I have a clustered index set up on ID and a non-clustered index on CompanyName.
    The database is on a dual-core 2.4 GHz SQL Server (Standard) box with 2 GB RAM.
    The count takes over 1 minute and returns a count of 20 million. Remember, I am just interested in the count, not the rows.
    Any clues or ideas on how to optimize?
    Thanks in advance!
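
    A non-clustered index on CompanyName alone should let this count be answered by a range scan of the narrow index, never touching the wide clustered index. A minimal sketch (the index name IX_Company_CompanyName is hypothetical; COUNT(*) is equivalent to COUNT(ID) here because ID is NOT NULL, and it lets the optimizer pick the narrowest index):

    -- Hypothetical index name; a single-column index keeps the scanned pages small
    CREATE NONCLUSTERED INDEX IX_Company_CompanyName
        ON dbo.Company (CompanyName);

    -- Verify which index is scanned and how many pages are read
    SET STATISTICS IO ON;
    SELECT COUNT(*) FROM dbo.Company WHERE CompanyName LIKE 'a%';

    With roughly 20 million matching rows, the scan still has to read every qualifying index row, so the logical-reads figure reported by STATISTICS IO is the real lower bound on the work the query must do.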
  2. ndinakar Member

    Check the query plan to see if the index is being used. Besides that, check whether there is a bottleneck in I/O or CPU, using perfmon counters. You might also need to add some muscle to the machine; those specs look like a desktop's. Is that your production server?
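
    If only a rough total row count is needed (rather than the LIKE-filtered count), SQL Server 2005 can also return it from metadata without scanning the table at all. A sketch; note this gives the whole-table count only, and the figure can lag slightly behind in-flight transactions:

    -- Approximate total row count from partition metadata (no table scan)
    SELECT SUM(p.row_count) AS approx_rows
    FROM sys.dm_db_partition_stats AS p
    WHERE p.object_id = OBJECT_ID('dbo.Company')
      AND p.index_id IN (0, 1);   -- heap (0) or clustered index (1) only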
  3. satya Moderator
