Hello, I have a stored procedure that is running slowly. Looking at the execution plan in Query Analyzer, I saw that the subtree cost is 3.5 and that a bookmark lookup accounts for 65% of it. I added a covering index, and sure enough the subtree cost dropped to 1.75 (although logical reads increased, which is another puzzle).

The problem is that the covering index I added includes many columns [4 varchars and a few ints] (I had no choice, since the query uses all of them and I need to cover them all with the index), and they add up to ~1200 bytes(?). The index was allowed to be created, but a warning message came up saying that if the length of the index key exceeds 900 bytes, inserts might fail. I believe they will fail if the data inserted into a row's key columns exceeds 900 bytes (since a key cannot be split between two pages, I believe).

So, although I don't expect any entered data to exceed 900 bytes... who knows - it seems unsafe to have this index. What do people do in this case? Just skip the index and live with the higher query cost?

Please advise. Thanks in advance!
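For context, here is a sketch of the kind of index I mean - table and column names are made up, but the column count and types match what I described (4 varchars plus a few ints, totaling roughly 1200 bytes of key):

```sql
-- Hypothetical names, for illustration only.
-- All columns referenced by the query are placed in the index key,
-- so the key size can approach ~1200 bytes, which is over SQL Server's
-- 900-byte index key limit and triggers the warning at CREATE time.
CREATE INDEX IX_Orders_Covering
ON dbo.Orders
(
    CustomerName,   -- varchar
    City,           -- varchar
    Region,         -- varchar
    Notes,          -- varchar
    StatusId,       -- int
    TypeId,         -- int
    Quantity        -- int
);
```

The index is created successfully; the warning only means that a future INSERT or UPDATE will fail if the actual combined length of these columns in a single row exceeds 900 bytes.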