SQL Server Performance

Error message Max tables exceeded 256

Discussion in 'Performance Tuning for DBAs' started by cejar, Nov 6, 2003.

  1. cejar New Member

    I'm getting this error message when I run an SP. I'm querying off views that join to multiple tables, but I know they don't exceed 256. Here is the error: "Could not allocate ancillary table for view or function resolution. The maximum number of tables in a query (256) was exceeded."

    Any help would be appreciated.

  2. ChrisFretwell New Member

    Can you a) post the query so we can see, and/or
    b) display the estimated query plan in QA?

    I have a feeling that, along with the tables in the underlying views, SQL is probably creating work tables, to a magnitude large enough to generate the error. If that's the case, then it's an optimization issue on the views, the sproc, or both.
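    If you want to capture the estimated plan in T-SQL rather than through the QA toolbar, something like the following works (a sketch; dbo.usp_MyProc is a placeholder for the actual stored procedure):

    ```sql
    -- Show the estimated plan without executing the statement.
    -- The SET must be in its own batch, hence the GO separators.
    SET SHOWPLAN_TEXT ON
    GO
    EXEC dbo.usp_MyProc   -- placeholder name for the problem SP
    GO
    SET SHOWPLAN_TEXT OFF
    GO
    ```

    The plan output will show every table and work table the optimizer touches, which should confirm whether the 256 limit is being hit by hidden intermediate tables.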

    Just out of curiosity, how many tables do you think it's using?

  3. cejar New Member

    I'm guessing about 10. It's not my SP. The guy I'm helping is going to minimize the number of joins and see if that works.

  4. cejar New Member

    He reduced the number of joins and still got the problem. Tomorrow we will run the estimated plan. By the way, I just found out that there are approximately 150 tables without indexes on their foreign keys.
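    A rough way to enumerate the foreign key columns so you can check each against sp_helpindex, using the SQL Server 2000 system tables (a sketch, not a complete missing-index detector):

    ```sql
    -- List every foreign key column in the current database.
    -- Compare each fk_table's columns against sp_helpindex output
    -- to spot FK columns with no supporting index.
    SELECT OBJECT_NAME(fk.fkeyid)        AS fk_table,
           OBJECT_NAME(fk.constid)       AS fk_constraint,
           COL_NAME(fk.fkeyid, fk.fkey)  AS fk_column
    FROM   sysforeignkeys fk
    ORDER  BY fk_table, fk_constraint
    ```

    Unindexed foreign keys won't cause the 256-table error, but they will make every one of those view joins a scan, so they're worth fixing either way.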

  5. FrankKalis Moderator

    I assume you're not hoping to take advantage of any existing indexes.
    I think 'Inside SQL Server 2000' states that any query joining more than four tables will not be optimized, because of the possible number of permutations of how to join the tables.

    Curious what this query looks like!

  6. cejar New Member

    Sorry for not getting back to you sooner; I've been very busy. Anyway, the SP is too big for me to want to post it. The SP involved an INSERT INTO statement that used many views referencing many tables. How I solved the issue was to run the Index Tuning Wizard on the SELECT statements from the views. I indexed some of the tables and everything seemed to work out. This reduced 20-minute-long queries to 1 minute. I also ran a defrag, because DBCC SHOWCONTIG reported under 40% scan density on some of the bigger tables; one even reported 14%.

    Overall these databases need some serious maintenance.
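    For anyone following along, the check-and-defrag cycle looks roughly like this ('dbo.BigTable' and the index name are placeholders):

    ```sql
    -- Report fragmentation; Scan Density close to 100% is healthy,
    -- so the 14% figure above indicates heavy fragmentation.
    DBCC SHOWCONTIG ('dbo.BigTable')

    -- Defragment one index in place (SQL Server 2000). Unlike
    -- DBCC DBREINDEX, this holds only short-term locks, so it can
    -- run while the database is in use.
    DBCC INDEXDEFRAG (0, 'dbo.BigTable', 'IX_BigTable_SomeColumn')
    ```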

  7. Luis Martin Moderator

    Indeed. Don't forget to update statistics from time to time, especially if you don't reindex frequently.
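    For example (table name is a placeholder):

    ```sql
    -- Refresh statistics for every table in the current database.
    EXEC sp_updatestats

    -- Or target one large table with a full scan for more
    -- accurate histograms than the default sampling gives.
    UPDATE STATISTICS dbo.BigTable WITH FULLSCAN
    ```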

    Luis Martin

    ...Thus mathematics may be defined as the subject in which we never know what we are talking about, nor whether what we are saying is true.
    Bertrand Russell
  8. Twan New Member

    Hi All,

    Just to comment on Frank's comment: a query with more than 4 tables will be optimized, but not fully. The statement will essentially be broken up into groups of tables, and those groups are optimized. So a query with 12 tables may be broken into 3 sets by the optimizer, where each set is optimized and then the join between the 3 sets is optimized. This is where a properly ordered set of tables can make a significant improvement.
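    If you already know a good join order, you can tell the optimizer to keep the order as written rather than permuting it. A minimal sketch with placeholder tables t1/t2/t3:

    ```sql
    -- FORCE ORDER makes the optimizer join the tables in the
    -- order they appear in the FROM clause, skipping reordering.
    SELECT t1.id, t3.val
    FROM   t1
           JOIN t2 ON t2.t1_id = t1.id
           JOIN t3 ON t3.t2_id = t2.id
    OPTION (FORCE ORDER)
    ```

    Use this with care: it helps only when your hand-picked order really is better than what the optimizer would choose.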

  9. FrankKalis Moderator

  10. Twan New Member

    No, I don't have any references at hand. I'll have a rummage around.

  11. PattyLand New Member

    On many of the lookup tables I noticed that they contained clustered indexes, even though the statistics were badly out of date. Do you think the query optimizer was using these clustered indexes instead of table scans?
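    One way to see how stale the statistics actually are ('dbo.LookupTable' is a placeholder name):

    ```sql
    -- Show when statistics were last updated for each real index
    -- on the table (indid 0 is the heap, 255 is text/image data).
    SELECT name                    AS index_name,
           STATS_DATE(id, indid)   AS stats_last_updated
    FROM   sysindexes
    WHERE  id = OBJECT_ID('dbo.LookupTable')
      AND  indid BETWEEN 1 AND 250
    ```

    If those dates are months old, the optimizer may well be picking plans based on row counts that no longer exist.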

  12. satya Moderator
