The current software package has over-normalized data, causing many queries to perform more than 3 joins (being modest). Therefore, we created larger denormalized tables to target the longer-running queries. This works fine and has improved performance, but I am a bit skeptical.

My skepticism is that 4-5 columns are part of the clustered index, and each of those columns has a legitimate reason for being in the index. There are also about 4-5 non-clustered indexes. Is there a better way to index?

We are using triggers to populate the denormalized tables; could this degrade performance? The number of data pages is in the tens of thousands, so should I recreate the tables and indexes? I need a new set of eyes; my head is spinning.
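For reference, here is a minimal sketch of the kind of setup I'm describing: a trigger-maintained denormalized table with a wide clustered key. All table and column names are made up for illustration; the real schema is different.

```sql
-- Hypothetical denormalized reporting table; names are illustrative only.
CREATE TABLE dbo.OrderSummary (
    CustomerId   INT           NOT NULL,
    RegionId     INT           NOT NULL,
    OrderDate    DATE          NOT NULL,
    OrderId      INT           NOT NULL,
    CustomerName NVARCHAR(100) NOT NULL,
    TotalAmount  DECIMAL(18,2) NOT NULL,
    -- Wide clustered key (4 columns here), similar to our real table.
    CONSTRAINT PK_OrderSummary
        PRIMARY KEY CLUSTERED (CustomerId, RegionId, OrderDate, OrderId)
);
GO

-- A trigger on the source table keeps the denormalized copy in sync.
-- Every write to dbo.Orders now also pays for this insert plus
-- maintenance of every index on dbo.OrderSummary.
CREATE TRIGGER trg_Orders_Insert ON dbo.Orders
AFTER INSERT
AS
BEGIN
    SET NOCOUNT ON;
    INSERT INTO dbo.OrderSummary
        (CustomerId, RegionId, OrderDate, OrderId, CustomerName, TotalAmount)
    SELECT i.CustomerId, c.RegionId, i.OrderDate, i.OrderId, c.Name, i.TotalAmount
    FROM inserted AS i
    JOIN dbo.Customers AS c
        ON c.CustomerId = i.CustomerId;
END;
```

My worry is exactly what the comments above note: the wide clustered key is duplicated into every non-clustered index, and the trigger adds overhead to every write on the source table.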