SQL Server Performance Forum – Threads Archive
How much memory?
Our SQL Server performance has deteriorated as the database has grown. I know there is already a hard-drive bottleneck (which I'm working on), but I want to know from someone with more experience how much difference memory makes, and whether adding more will take some of the load off the drives, even though paging doesn't seem extreme yet. I'm trying to get an idea of how much difference memory will make before I spend the time and money to load a new OS that supports more memory. We already have SQL Server 2005, which supposedly supports as much memory as the OS allows. Louder+Harder+Faster >= Better
If your database is growing because of a build-up of data from a fixed user base,
a good design should not require more memory.
If it is growing because the user base has grown, then it is reasonable to add memory. More than likely, you have a number of poor execution plans caused by design and index issues.
If the database was OK before, it might be because small table scans were no big deal, but now your tables are bigger. So always analyze your top calls to SQL Server,
then analyze the execution plans for those top calls.
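As a sketch of how to find those top calls, the SQL Server 2005 DMVs can rank cached statements by cumulative logical reads (a good proxy for I/O pressure); you can then pull the plan for each offender:

```sql
-- Top 10 cached statements by cumulative logical reads (SQL Server 2005+)
SELECT TOP 10
    qs.total_logical_reads,
    qs.execution_count,
    qs.total_logical_reads / qs.execution_count AS avg_logical_reads,
    SUBSTRING(st.text, (qs.statement_start_offset / 2) + 1,
        ((CASE qs.statement_end_offset
              WHEN -1 THEN DATALENGTH(st.text)
              ELSE qs.statement_end_offset
          END - qs.statement_start_offset) / 2) + 1) AS statement_text
FROM sys.dm_exec_query_stats AS qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS st
ORDER BY qs.total_logical_reads DESC;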
Yes, we’ve been working to optimize the SQL and have cleaned it up quite a bit. We’ve gotten to the point where the time put into optimizing doesn’t seem to give enough benefit in return. The DB is 40 GB and the server has 4 GB of memory, of which the OS (Windows 2003 Standard) is allowing SQL Server to use about 2.7 GB. The batch jobs that run throughout the day have very high I/O, causing users’ selects to take a long time. I was wondering: if the tables they were selecting from were already in memory waiting for them, would their selects not be as affected by the high I/O on the drives? Louder+Harder+Faster >= Better
Did you update statistics and/or defrag indexes as part of your maintenance plan? Luis Martin
SQL-Server-Performance.com All in Love is Fair
All postings are provided “AS IS” with no warranties for accuracy.
I assumed those things were being done as part of the scheduled database maintenance plan, but now that you mention it… I don’t see those options in it. How would I schedule them? Do I script them out myself and run them in SQL Server Agent? Louder+Harder+Faster >= Better
Enable AUTO_UPDATE_STATISTICS from the Options pane of the database properties.
Also periodically run UPDATE STATISTICS on the tables that are frequently used by the application. Also try bulk loading into a staging table and doing your calculations there before importing into the main tables. Satya SKJ
Microsoft SQL Server MVP
Contributing Editor & Forums Moderator
This posting is provided “AS IS” with no rights for the sake of knowledge sharing.
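As a sketch of the maintenance commands discussed above, which can be scripted and scheduled as SQL Server Agent job steps (the database and table names are placeholders):

```sql
-- Turn on automatic statistics updates for the database
ALTER DATABASE MyDatabase SET AUTO_UPDATE_STATISTICS ON;

-- Refresh statistics on a heavily used table (FULLSCAN for accuracy; slower than sampling)
UPDATE STATISTICS dbo.Orders WITH FULLSCAN;

-- Rebuild all indexes on the table to remove fragmentation (SQL Server 2005 syntax)
ALTER INDEX ALL ON dbo.Orders REBUILD;
```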
Satya, I understand "bulk load to a staging table", but I'm not sure what you mean by "compute your calculations for further import to main tables". The jobs typically read a recordset from the database into our VB.NET application, format the data and add the IDs required for linking to other tables, then insert each record back into the database one at a time as it is processed. Is there a more efficient way to put the data into the main tables? Often this is a few hundred thousand records in a single job. Louder+Harder+Faster >= Better
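The staging approach usually replaces that row-at-a-time loop with one bulk load plus one set-based statement that resolves the linking IDs with joins. A hedged sketch, assuming made-up table and column names (CustomerCode/ProductCode standing in for whatever natural keys the lookups use):

```sql
-- 1. Bulk load the raw rows into a staging table
--    (BULK INSERT here; SqlBulkCopy from the VB.NET side works too)
BULK INSERT dbo.OrderStaging
FROM '\\fileserver\share\orders.dat'   -- placeholder path
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n', TABLOCK);

-- 2. Resolve the linking IDs and insert into the main table in one statement,
--    instead of a few hundred thousand single-row inserts
INSERT INTO dbo.Orders (CustomerID, ProductID, Amount)
SELECT c.CustomerID, p.ProductID, s.Amount
FROM dbo.OrderStaging AS s
JOIN dbo.Customers AS c ON c.CustomerCode = s.CustomerCode
JOIN dbo.Products  AS p ON p.ProductCode  = s.ProductCode;
```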
How is the transaction wrapped for the batch job? Per record? Per hundred thousand records? Per some set of records? How is the primary key generated for the insert? Are users suffering contention while you insert records? What is the buffer cache hit ratio?
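One common answer to the transaction-scope question is to commit in fixed-size chunks, keeping each transaction (and its lock footprint) short. A sketch, again with placeholder object names, moving rows from staging into the main table 10,000 at a time:

```sql
DECLARE @rows INT;
SET @rows = 1;

WHILE @rows > 0
BEGIN
    BEGIN TRANSACTION;

    -- Move up to 10,000 staging rows per transaction into the main table
    -- (DELETE ... OUTPUT INTO is SQL Server 2005+ syntax)
    DELETE TOP (10000) FROM dbo.OrderStaging
    OUTPUT deleted.CustomerID, deleted.Amount
    INTO dbo.Orders (CustomerID, Amount);

    SET @rows = @@ROWCOUNT;

    COMMIT TRANSACTION;
END;
```

Smaller batches mean shorter blocking for the users' selects at the cost of more commits; the 10,000 figure is just an illustrative starting point.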