Greetings, I have an application that parses huge flat log files, similar to the ones IIS produces. From time to time the application has to dump partially aggregated data to a SQL Server database, which results in about 100k+ inserts per log file. Such a large number of inserts hurts performance quite badly: while the application's resources are freed after the log file is parsed, SQL Server's memory only grows and is not released even after the application shuts down.

I've created temporary files to be used as a BULK INSERT source for some of the tables, but some tables are referenced by others, so BULK INSERT most likely won't work there.

Given what I've described, what would be your general performance considerations, or are there some obvious ways to sort this out? Thanks in advance!
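
For reference, the load step for the standalone tables is roughly a sketch like the one below (the table name, file path, and delimiters are just placeholders, not my real schema):

    -- Minimal sketch: load one pre-aggregated temp file into a standalone table.
    -- dbo.LogAggregates and the file path are hypothetical examples.
    BULK INSERT dbo.LogAggregates
    FROM 'C:\temp\log_aggregates.csv'
    WITH (
        FIELDTERMINATOR = ',',     -- column delimiter used when writing the temp file
        ROWTERMINATOR   = '\n',    -- one aggregated row per line
        TABLOCK,                   -- table lock so the load can be minimally logged
        BATCHSIZE       = 50000    -- commit in batches instead of one huge transaction
    );

It is the tables with foreign-key references to each other where I am not sure this approach applies.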