We are using SQL Server 2000 as our back-end and an application called VFP as our front-end. At times, the application’s performance is very poor. Using Task Manager, we saw that the sqlservr.exe process consumes memory while running queries but does not appear to release it after the queries complete. We believe this may be the cause of our application’s performance problems.
Is there any option to release the additional memory that has been used by SQL Server after a query is completed?
Answer: Based on what you have told me, I really doubt that your theory is correct. First, I am assuming that your application and SQL Server are running on the same physical server. If this is the case, the best way to reduce performance issues for either the application or SQL Server is to separate them onto different servers. This is basic advice I give to everyone: SQL Server should always run on a dedicated server.
From what you have provided, it is hard for me to guess exactly what your problem is. I recommend you capture both a Performance Monitor log (over at least 24 hours) and a Profiler trace over the same period. Together, this information will help you identify the performance problem you see in your application.
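If you prefer to avoid the overhead of the Profiler GUI on a production box, you can capture the same data with a server-side trace using SQL Server 2000's sp_trace_* procedures. The sketch below is one way to set that up; the file path, 50 MB rollover size, and the particular columns captured are illustrative choices you should adjust for your environment.

```sql
-- Hedged sketch: a server-side trace that records each completed batch,
-- equivalent to what Profiler would show but written straight to a file.
DECLARE @TraceID int;

-- Option 2 = file rollover; path and 50 MB size are illustrative.
-- SQL Server appends the .trc extension to the file name automatically.
EXEC sp_trace_create @TraceID OUTPUT, 2, N'C:\Traces\AppTrace', 50, NULL;

-- Event 12 = SQL:BatchCompleted; capture TextData (1), Duration (13),
-- Reads (16), Writes (17), and CPU (18) for each completed batch.
EXEC sp_trace_setevent @TraceID, 12, 1, 1;
EXEC sp_trace_setevent @TraceID, 12, 13, 1;
EXEC sp_trace_setevent @TraceID, 12, 16, 1;
EXEC sp_trace_setevent @TraceID, 12, 17, 1;
EXEC sp_trace_setevent @TraceID, 12, 18, 1;

-- Status 1 = start the trace; later, use 0 to stop it and 2 to close it.
EXEC sp_trace_setstatus @TraceID, 1;
```

Once you have stopped and closed the trace, you can load the file into a table with fn_trace_gettable and query it to find your longest-running statements.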
By the way, there is no way to tell SQL Server, through a configuration option, to give up memory it has already taken when running a query, and that is by design: SQL Server keeps data pages cached in memory so later queries can reuse them, and under the default dynamic memory setting it releases memory back to Windows only when the operating system signals memory pressure. The closest you can come is to change SQL Server’s memory configuration from dynamic (the default) to a fixed amount. This way, you can limit how much memory SQL Server can use on the server. But remember, if you do this, you may be harming SQL Server’s overall performance.
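If you do decide to cap SQL Server's memory, you can do it with sp_configure. This is a minimal sketch; the 1024 MB figure is purely illustrative and should be sized for your server, keeping in mind the caveat above.

```sql
-- Hedged sketch: replace dynamic memory management with a fixed amount.
-- The memory options are advanced, so expose them first.
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;

-- 1024 MB is an illustrative value, not a recommendation.
EXEC sp_configure 'max server memory', 1024;  -- upper limit, in MB
EXEC sp_configure 'min server memory', 1024;  -- min = max fixes the amount
RECONFIGURE;
```

Setting min server memory equal to max server memory is what turns the dynamic range into a fixed allocation; leaving min at its default while lowering max merely caps the ceiling.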