Hi, I'm experiencing a huge performance difference between three SQL Server setups, where the best/fastest machine shows the worst performance, and I'm out of ideas where to look.

I have a very simple database with two tables, each containing one nvarchar(35) column with a unique non-clustered index on that column. Both tables contain roughly 850K unique entries, with some differences between them. I'm running the query:

select column1 from table1 where column1 not in (select column2 from table2)

(I'm aware that the query

select column1 from table1 where column1 not in (select column2 from table2 where column1 = column2)

performs a lot better, but that is not the point I'm investigating.)

- On a desktop running SQL Server 2000 (8.00.2039, Developer Edition) on Windows XP with an Intel dual-core processor and 1 GB RAM, this query takes 4 seconds.
- On a desktop running SQL Server 2005 (9.00.4035.00, Standard Edition) on Windows XP with an Intel dual-core processor and 2 GB RAM, this query takes 35 seconds.
- On a server running SQL Server 2005 64-bit (9.00.4035.00 (X64), Developer Edition) on Windows Server 2003 64-bit with an Intel Xeon quad-core processor and 16 GB RAM, this query takes more than 12 minutes!

I've checked the execution plans, and they are identical on all three machines (each plan performs a table scan rather than using the index, but that is not the issue I need to resolve here). All database properties between the two SQL 2005 setups have been compared and are equal, and full-text search/indexing is disabled/removed.

Can anyone point me in another direction as to where to look for this huge difference in performance? Is it as simple as Developer (2005) versus Standard (2005)?

Thanks for any indication!
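
For anyone trying to reproduce or compare these numbers, a repeatable cold-cache timing of the query on each box could look like the sketch below (the table/column names are the ones from the question; the DBCC calls require sysadmin rights and should not be run on a production server):

```sql
-- Sketch: repeatable cold-cache timing of the query on one machine.
DBCC FREEPROCCACHE;      -- discard cached plans so the plan is recompiled
DBCC DROPCLEANBUFFERS;   -- force reads from disk instead of the buffer pool

SET STATISTICS TIME ON;  -- report parse/compile and execution CPU + elapsed time
SET STATISTICS IO ON;    -- report logical/physical reads per table

select column1 from table1 where column1 not in (select column2 from table2);

SET STATISTICS TIME OFF;
SET STATISTICS IO OFF;

-- Instance-level settings worth comparing between the two 2005 machines,
-- since database-level properties are already confirmed equal:
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'max degree of parallelism';
EXEC sp_configure 'max server memory (MB)';
```

Comparing the CPU time, elapsed time, and physical-read counts this produces on each machine would show whether the 12 minutes are spent on CPU or on I/O, which narrows down where to look next.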