SQL Server Performance

How to tune a query on a table with Text columns

Discussion in 'SQL Server 2005 Performance Tuning for DBAs' started by Narine, Jun 10, 2008.

  1. Narine New Member

    Hi everyone,
    We are using SQL 2005. We've got an email table that holds all our customers' emails. Needless to say, it's very big, currently 4.5 million rows and growing. The table has a bunch of text columns, such as EmailTo, CCto, emailBody. I have a Full Text Index created on it.
    Queries that search this table are taking a while. One query that I'm trying to tune is doing a Key Lookup (bookmark lookup), which is 99% of the cost, to get all the columns in the SELECT clause. The columns in the SELECT clause include some of the text datatype, so I can't create a covering index on them.
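    To give a picture, the query's shape is roughly this (the table, column, and filter names are simplified stand-ins):

        -- Simplified shape of the query (all names are stand-ins).
        -- A nonclustered index can satisfy the WHERE clause, but the
        -- text columns in the SELECT force a key lookup into the
        -- clustered index for every qualifying row.
        DECLARE @UserID INT
        SET @UserID = 42

        SELECT EmailID, EmailTo, CCto, EmailBody
        FROM dbo.Email
        WHERE UserID = @UserID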
    There's also a problem when trying to sort on one of those text columns, like 'emailTo'. I know about the new VARCHAR(MAX) datatype in 2005 that can replace TEXT. However, at this point it's not an option, because our code has to stay SQL 2000 compatible: some of the clients hosting our application are still on 2000, and we need to be able to deploy code builds to them.
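    For reference, SQL Server won't accept a text column in ORDER BY at all; the usual workaround, which works on both 2000 and 2005, is to sort on a converted prefix, e.g.:

        -- text/ntext can't appear in ORDER BY, so sort on a converted
        -- prefix instead (the 255-character cutoff is arbitrary):
        SELECT EmailID, EmailTo
        FROM dbo.Email
        ORDER BY CONVERT(VARCHAR(255), EmailTo)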
    Any help on how to tune this will be much appreciated.
    Thanks
    Narine
  2. techbabu303 New Member

    4.5 million rows with a SELECT pulling all the text-type columns and no restrictive predicate leaves no option to tune using indexes.
    One way would be to restrict the number of rows being pulled, say to around 20,000 at a time, which breaks the command into batches.
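    Something along these lines, assuming an EmailID clustered key (the key and column names are guesses):

        -- One possible batching pattern (names are guesses). Grab a
        -- page of keys cheaply, then join back so only those 20,000
        -- rows pay for the expensive text-column lookups.
        DECLARE @LastID INT
        SET @LastID = 0

        SELECT e.EmailID, e.EmailTo, e.EmailBody
        FROM dbo.Email AS e
        JOIN (SELECT TOP 20000 EmailID
              FROM dbo.Email
              WHERE EmailID > @LastID
              ORDER BY EmailID) AS k
            ON k.EmailID = e.EmailID
        -- ...then set @LastID to the highest EmailID returned and repeat.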
    Cheers
    Sat

  3. Narine New Member

    Thank you for the reply! The search criteria always include the UserID of the user viewing his email.
    I was finally able to convince my manager to convert these TEXT columns to VARCHAR(MAX). No reason why we have to suffer because someone else keeps deferring the upgrade.
    Now I have questions about this conversion. Has anyone tried this before, altering columns from TEXT to VARCHAR(MAX)? I just tried it in our development environment on a table that has half a million rows. First, it converted very quickly, which I didn't expect. Second, the size of the database did not decrease, which I did expect to occur, since VARCHAR(MAX) is of a variable length.
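    In case it helps anyone else, the conversion was essentially this (real names changed). From what I've read, the speed and the unchanged size have the same cause: the ALTER only touches metadata and leaves the existing values in their LOB pages.

        -- The ALTER itself runs fast because existing values stay put
        -- in their LOB pages, which is also why no space is freed.
        ALTER TABLE dbo.Email ALTER COLUMN EmailBody VARCHAR(MAX) NULL

        -- Rewriting the values lets ones small enough move in-row; on a
        -- big table, do this in batches to keep the log under control.
        UPDATE dbo.Email SET EmailBody = EmailBody

    Even after that, the freed pages stay allocated until the table and its indexes are rebuilt or the files are shrunk.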
    Also, because these columns are part of the Full Text Search, I had to remove them from the FTI, alter the data type, then re-add them to the FTI. Does anyone know if this requires rebuilding the FTI completely?
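    The statements were along these lines (names changed again). From what I can tell, adding a column back does flag the index for a full population, which starts automatically unless you say WITH NO POPULATION. Can anyone confirm?

        -- Take the column out of the full-text index, alter it, put it back:
        ALTER FULLTEXT INDEX ON dbo.Email DROP (EmailBody)
        -- ...the ALTER COLUMN runs here...
        -- WITH NO POPULATION defers the repopulation; leave it off and a
        -- full population starts as soon as the column is added back.
        ALTER FULLTEXT INDEX ON dbo.Email ADD (EmailBody) WITH NO POPULATION
        -- Kick the rebuild off at a convenient time; the column isn't
        -- searchable until a population completes:
        ALTER FULLTEXT INDEX ON dbo.Email START FULL POPULATION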
    Thanks
    Narine
  4. Rlaubertsr New Member

    How about partitioning the table by one of the common fields? That will reduce the number of records in each partition and allow SQL to query a reduced record set.
  5. techbabu303 New Member

    Partitioning the table is a good idea; I forgot you are using SQL 2005.
    Instead of looking at the size of the database, check the table size; that would give an accurate picture of any change from the column-type conversion.
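    Something like this, with your email table's name swapped in:

        -- Table-level space figures rather than database size:
        EXEC sp_spaceused 'dbo.Email'
        -- Pass 'true' as the second argument to recalculate the usage
        -- figures first if they look stale:
        EXEC sp_spaceused 'dbo.Email', 'true'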
    Cheers
    Sat
  6. Narine New Member

    Thank you for the suggestion!
    How would you suggest partitioning it? Also, we're using SQL2K5 Standard edition. Isn't automatic partitioning available in Enterprise only?
    Narine
  7. Rlaubertsr New Member

    Yes, table partitioning is available only in the Enterprise edition. There are lots of articles on partitioning tables and indexes. For the Standard edition the only option is partitioned views, which I don't find a good option in my databases, and the performance is not that good. Upgrading to Enterprise should be considered for both performance and expandability.
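    For completeness, a partitioned view is just per-range member tables with CHECK constraints behind a UNION ALL view, roughly like this (all names and date ranges are invented):

        -- Each member table holds one range, enforced by a CHECK
        -- constraint so the optimizer can skip tables a query can't hit.
        -- The partitioning column must be part of the primary key for
        -- the view to be updatable.
        CREATE TABLE dbo.Email2007 (
            EmailID      INT NOT NULL,
            ReceivedDate DATETIME NOT NULL
                CHECK (ReceivedDate >= '20070101' AND ReceivedDate < '20080101'),
            EmailBody    TEXT NULL,
            CONSTRAINT PK_Email2007 PRIMARY KEY (EmailID, ReceivedDate)
        )
        CREATE TABLE dbo.Email2008 (
            EmailID      INT NOT NULL,
            ReceivedDate DATETIME NOT NULL
                CHECK (ReceivedDate >= '20080101' AND ReceivedDate < '20090101'),
            EmailBody    TEXT NULL,
            CONSTRAINT PK_Email2008 PRIMARY KEY (EmailID, ReceivedDate)
        )
        GO
        CREATE VIEW dbo.EmailAll
        AS
        SELECT EmailID, ReceivedDate, EmailBody FROM dbo.Email2007
        UNION ALL
        SELECT EmailID, ReceivedDate, EmailBody FROM dbo.Email2008
        GO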

    If this is an option, as long as you have a field that can be an indexed value, you can use it to partition the table. In my database, we have a companyid field that makes an ideal partitioning value. Dates are also ideal, depending on the situation of course.
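    And if Enterprise ever becomes an option, native 2005 partitioning on a date column looks roughly like this (boundary dates invented):

        -- A partition function defines the boundary values, a scheme
        -- maps the partitions to filegroups, and the table is created
        -- on the scheme; SQL routes rows to partitions automatically.
        CREATE PARTITION FUNCTION pfEmailDate (DATETIME)
            AS RANGE RIGHT FOR VALUES ('20070101', '20080101')
        GO
        CREATE PARTITION SCHEME psEmailDate
            AS PARTITION pfEmailDate ALL TO ([PRIMARY])
        GO
        CREATE TABLE dbo.Email (
            EmailID      INT NOT NULL,
            ReceivedDate DATETIME NOT NULL,
            EmailBody    VARCHAR(MAX) NULL
        ) ON psEmailDate (ReceivedDate)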

    Raymond Laubert
    MCITP, MCDBA, MCSE, MCT
