
SQL Server Performance Forum – Threads Archive

SQL Server 2000 AND IIS on same box

Hi All, My site has been running on a dedicated PIII 1.1GHz 512MB RAM 40GB IDE Win2K server connecting to a shared MS SQL 2000 server.
My IIS Server was getting bogged down due to multiple users hitting my site, 1000+ users.
I decided to move much of my processing from my ASP pages to stored procedures, which, from everything I've read, is a smart thing to do. I did this successfully and noticed that pages were loading faster. My testing was done during off-peak times.
The first time I hit a peak period with the 1000 users, the shared SQL Server started to choke! According to my host, I was over my resource limit (disk space/processor/memory) and would need to move to a dedicated SQL Server. They put MS SQL Server 2000 on the same box I described above. I suspect that my site will fail miserably with this configuration, as there is no way both IIS and SQL Server will be happy running in only 512MB of RAM. Are my suspicions correct?
Also, I have a number of pages that query and use stored procedures on every load even if the data has not changed. I've changed those ASP pages to write the data to disk and then, on subsequent page loads, read it from disk rather than retrieve it from SQL Server. Basically I put the burden on disk I/O instead of SQL Server.
Was this a good move, or will I cause IIS to choke due to disk access and memory usage? Thanks in advance,
Larry
Hard to answer without knowing how much memory IIS and SQL use today. If SQL wants all of the 512 MB, you need to configure a max limit of something like 250-350 MB and leave the rest to the OS and IIS, or you might run into big problems with swapping. You would need to monitor that over time.

Offloading data to disk that IIS will most likely put in its cache later on sounds like a good idea. It depends on how it is implemented, though. You could put the data directly into memory instead of on disk with 3rd-party DLLs (or something custom made), but however you do it, you need memory eventually :)

I would try to get some figures on the CPU and memory usage for IIS and SQL if possible. If 512 MB isn't enough, then you need to add more memory or tweak the queries and design even more (like using client-side ADO cursors in ASP, finding even better indexes, etc.)

/Argyle
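For reference, a minimal T-SQL sketch of the memory cap Argyle describes; 300 MB is only an example value in the 250-350 MB range he suggests, and the commands assume sysadmin rights on the dedicated box:

-- 'max server memory' is an advanced option, so expose it first
EXEC sp_configure 'show advanced options', 1
RECONFIGURE

-- cap SQL Server at roughly 300 MB, leaving the rest for the OS and IIS
EXEC sp_configure 'max server memory', 300
RECONFIGURE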
Hi Larry, Argyle has already covered the main points I think… A couple of possible additions:
– How big is the database? 512MB RAM is not a huge amount, and a single-CPU PIII is also not THAT highly spec'd, yet you have > 1000 concurrent users? Are the users mainly reading data or mainly writing data?
– As Argyle suggests, set the max memory for SQL to give IIS some breathing space. If you can, then monitor memory usage of the inetinfo.exe and sqlservr.exe processes (and any dllhost.exe); also monitor CPU and Pages/sec (see the sketch after this post if you can't get at Perfmon).
– Use Profiler to check that no queries are taking longer than, say, 100ms (the SQL:BatchCompleted and RPC:Completed events with a Duration >= 100 filter will give you this).
– As for caching stuff in ASP, it can be a good idea as pointed out by Argyle, although it does depend on how it is cached… SQL is very good at caching too, so in a lot of cases there is not a lot to gain by caching data in IIS (it depends on the complexity/cost of the query versus the time to retrieve stuff from the home-built IIS cache). Cheers
Twan
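Twan's point about monitoring memory is worth doing even without Perfmon access; a rough picture of SQL Server's side of it is available from inside SQL Server 2000 itself. A minimal sketch (counter names can vary slightly between builds):

-- how much memory SQL Server has actually taken versus what it wants
SELECT object_name, counter_name, cntr_value
FROM master.dbo.sysperfinfo
WHERE counter_name LIKE '%Server Memory%'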
Thanks for the input! Well, the site is an NFL-related site and I get bombarded every Sunday. My server is hosted, so I've been having trouble getting detailed info from them about CPU/memory usage. Today was the first day where the new config I described in my original post got a workout. It failed. IIS seemed to be happy, but SQL Server was choking with messages like:

Microsoft OLE DB Provider for SQL Server error '80040e14'
Could not perform the requested operation because the minimum query memory is not available. Decrease the configured value for the 'min memory per query' server configuration option.

and

Microsoft OLE DB Provider for SQL Server error '80040e14'
There is insufficient system memory to run this query.

There are mostly reads going on during the day, but every 20+ minutes my system updates the database based on game scores. This update is done entirely in a stored procedure.
Users start seeing the above messages when they go to pages that read data.
When things really started to get bogged down, I noticed that my stored procedure would not complete. It would get stuck on some simple SELECT statements using the aggregate function COUNT(). The stored procedure does a lot of counting and updates about 10000 records. What would cause it to just halt, seemingly out of the blue?
Am I expecting too much out of the PIII 1GHz with 512MB RAM? Thanks,
Larry
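One other thing worth checking when the procedure appears to hang is whether it is simply being blocked by another connection rather than starved of memory; a minimal sketch against the sysprocesses system table in SQL Server 2000:

-- sessions currently blocked, and the spid doing the blocking
SELECT spid, blocked, waittime, lastwaittype, cmd
FROM master.dbo.sysprocesses
WHERE blocked <> 0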

Microsoft OLE DB Provider for SQL Server error ‘80040e14’
There is insufficient system memory to run this query.

There are two alternatives to get rid of this issue. To reduce the amount of memory used by the optimizer to search for an optimal plan, you can use the FORCE ORDER query hint to force the join order. However, forcing the join order may lead to a less than optimal plan, which in turn may degrade performance. For more information about how to use the FORCE ORDER query hint, refer to the "OPTION Clause" topic in SQL Server Books Online. The other alternative is to increase the maximum amount of memory available to SQL Server by adding physical memory to the box.

For the other error:

Microsoft OLE DB Provider for SQL Server error '80040e14'
Could not perform the requested operation because the minimum query memory is not available. Decrease the configured value for the 'min memory per query' server configuration option.

BOL specifies: resubmit the query. If resubmitting the query does not allow the query to run, you can (see the sketch after this list):
quote:
Add additional memory to the server.
Create one or more indexes.
Reduce the value of min memory per query.
Increase the query wait configuration option if it is a nonzero value.
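A minimal sketch of the FORCE ORDER hint and the 'min memory per query' change mentioned above; the query and its table names are placeholders rather than Larry's actual schema, and 512 KB is only an example value:

-- hypothetical query showing the FORCE ORDER hint; tables and columns are placeholders
SELECT m.PoolID, COUNT(*) AS Picks
FROM PoolMembers m
JOIN MemberPicks p ON p.MemberID = m.MemberID
GROUP BY m.PoolID
OPTION (FORCE ORDER)

-- lower 'min memory per query' (default 1024 KB); needs 'show advanced options' set to 1
EXEC sp_configure 'min memory per query', 512
RECONFIGURE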
_________
Satya SKJ
Moderator
SQL-Server-Performance.Com

Hi Larry, hmmm… to be honest I’m not sure what is going on there… Are you able to find out from your hosting provider:
– How much memory each of the processes on the server is using
– results of an sp_configure (with advanced options set to 1)
– Peak number of concurrent IIS sessions
– Peak number of concurrent SQL connections
How does the application connect to the database?
Is it somehow preventing database connection pooling?
Are they all using the same user id? database?
Is everything done through stored procedures?
How complex are the database reads?
You have a single IIS site?
How long does your update procedure take? How complex is it?
Cheers
Twan
PS sorry lots of questions, not many answers…
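For what it's worth, a small sketch of what the hosting provider (or anyone with enough rights) could run to get at a couple of those figures; it assumes access to the master database on the SQL Server 2000 box:

-- full list of server settings, including the memory options discussed above
EXEC sp_configure 'show advanced options', 1
RECONFIGURE
EXEC sp_configure

-- rough count of current SQL connections; worth sampling during a Sunday peak
SELECT COUNT(*) AS current_connections
FROM master.dbo.sysprocesses
WHERE spid > 50   -- spids 50 and below are reserved for system processes in SQL Server 2000

-- connections per login and application, handy for spotting pooling problems
SELECT loginame, program_name, COUNT(*) AS connections
FROM master.dbo.sysprocesses
GROUP BY loginame, program_name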
Thanks Satya, I'm trying to find out how many $$ it will cost to bump my physical memory now.

Twan, I don't mind the questions!

I don't have permissions to run sp_configure with advanced options set to 1.
I am asking my host about each one of your bullets right now.

Answers to some of your questions:

>>How does the application connect to the database?
I have a single connection include file that my ASP pages include, with a function called Connect(), and I use a string that looks like this:

ConnStr = "Provider=SQLOLEDB;Driver={SQL Server};Server=www.servername.net;Uid=myuid;Pwd=mypassword;Database=mydatabase"

>>Is it somehow preventing database connection pooling?
What would prevent database connection pooling?

>>Are they all using the same user id? database?
Yes, according to the include file I mentioned.

>>Is everything done through stored procedures?
Well, I was mostly using ASP only with SQL queries, but then I took a lot of the processing out of the ASPs and moved it into stored procedures to ease the load on IIS. That killed my shared SQL Server.

>>How complex are the database reads?
Most of my reads are simple SELECTs, some with JOINs, and in some cases I'm doing COUNT() and SUM() functions in some SELECTs.

>>You have a single IIS site?
This is a single IIS site.

>>How long does your update procedure take? How complex is it?
Well, on my new SQL Server on my dedicated box, my update stored procedure takes about 2 minutes during dead time and 10 minutes during heavy times, but on occasion it just hangs at random spots. I added some print statements to monitor its progress.
The stored procedure is 509 lines long. It does some looping. For each member, it updates records based on that member's settings, so in each loop there are sub-selects, some using COUNT() and SUM(). Then some updates are done.

Another question:

I'm calling my stored procedure with a connection object rather than a command object. Is this a bad thing? So, with my Connect() using Server.CreateObject("ADODB.Connection") and the above connection string, a call from ASP might look like this:

Set DB = Connect()
Set RS = DB.Execute("exec spMyUpdateProc " & param1 & ", " & param2)

Is this bad?

Thanks for the help!
Larry
1.
Your connection string looks a bit strange. You provide both OLE DB and ODBC parameters.

Provider=SQLOLEDB means OLE DB
Driver={SQL Server} means ODBC

I would remove the ODBC reference and create the connection string like this:
ConnStr = "Provider=SQLOLEDB;Data Source=www.servername.net;Initial Catalog=mydatabase;User Id=myuid;Password=mypassword"

Not that it will affect performance in a big way, but it's cleaner :p

2.
You specify how you open your connections, but do you close them as well?

3.
You say you loop in the update SP. Do you loop with CURSORS or with a WHILE statement?

4.
The way you execute your SPs will result in server-side cursors by default (i.e. handled in tempdb in SQL Server). If you use the command object instead you can specify client-side cursors and disconnect the recordset to use the ADO cursor engine instead. This is often quicker if you are not returning large amounts of data with each SELECT. It will use memory on the web server, but since you share IIS and SQL it might not make a big difference here (you do get away from using tempdb, though). Using the ADO cursor engine will not help your update SP; it's most effective when returning data.

Also, you are open to SQL injection when you send DB.Execute("exec spMyUpdateProc " & param1 & ", " & param2) straight in, unless you remove or change characters like ' in your param code before you execute it.

Example of how to get a disconnected client-side recordset with ADO in ASP:
—————————
Function GetAuthor(authorID)
Dim rs, cmd

'ADO constants such as adCmdStoredProc require adovbs.inc or a type library reference

'Instantiate objects
Set rs = Server.CreateObject("adodb.Recordset")
Set cmd = Server.CreateObject("adodb.Command")

'Open connection
cmd.ActiveConnection = GetConnectionString()

'Specify SP
cmd.CommandText = "dbo.GetAuthorByAuthorID"
cmd.CommandType = adCmdStoredProc

'Specify parameters
cmd.Parameters.Append cmd.CreateParameter("@authorID", adInteger, adParamInput, , authorID)

'Execute the query with client-side cursor
rs.CursorLocation = adUseClient
rs.Open cmd, , adOpenStatic, adLockReadOnly

'Disconnect the recordset and clean up
Set rs.ActiveConnection = Nothing
Set cmd.ActiveConnection = Nothing
Set cmd = Nothing

'Return the recordset
Set GetAuthor = rs
End Function

Function GetConnectionString()
 GetConnectionString = "Provider=sqloledb;Data Source=MyServer;Initial Catalog=MyDB;User Id=MyUser;Password=MyPass;App=MyApp"
End Function
—————————
Microsoft has released some pages called ADOHelper.asp (or DBHelper.asp) that include optimized ways to call your stored procedures in ASP. They should be available on the net. The quicker you execute your statements, the faster resources can be released and used by new connections :)

/Argyle
Thanks Argyle! That was a wealth of info! Here are my responses:
1. I will definitely clean up my connection string. Thanks!
2. Just as I have a connection include file, I also have a clean-up include file that closes all recordsets. I thought it was explicitly calling a close on all DB connections, but instead I just do a Set DB = Nothing. That's probably not good enough, is it? In my clean-up file I do something like this for each recordset I use:

'If it's a recordset and open, close it
if TypeName(RS) = "Recordset" then
    if RS.state <> adStateClosed then
        RS.close
    end if
end if

Can I do something similar for my DB connection to know whether I need to close it or not? I know that on some of my pages I explicitly call DB.close, but I thought I was making that call in my clean-up include file as well, and I'm not.

3. My update procedure skeleton loops like this:
declare @cur1 cursor
declare @col1 int, @col2 int
set @cur1 = cursor local scroll for
    SELECT col1, col2 FROM PoolMembers ORDER BY PoolID
open @cur1
fetch next from @cur1 into @col1, @col2
while @@FETCH_STATUS = 0
begin
    /* do work here */
    /* some selects with aggregate function COUNT(*) and other logic */
    /* then of course an UPDATE */
    fetch next from @cur1 into @col1, @col2
end
/* tidy up when the loop is done */
close @cur1
deallocate @cur1

4. I do some SQL injection protection on my param variables. For the update procedure, none of the params come from user input anyway. Now that you mention tempdb, that reminds me of something I didn't mention: the stored procedures for the pages that I am now writing to disk use tempdb quite a bit. Basically, I have a couple of tables with info in them that I reformat into a temp table and return to the page, so that each column of the temp table corresponds to a column in the table I'll be putting in the HTML on the page. Is this usage bad? Sounds like the client-side recordset example you gave may be good for these pages. I will look into this now. Thanks!
Larry
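As an aside on Argyle's cursor-versus-WHILE question, here is a minimal sketch of the same skeleton written with a WHILE loop instead of a cursor; it assumes PoolMembers has a unique numeric MemberID key, which is a placeholder rather than the real schema:

declare @MemberID int

-- walk the members in key order without holding a cursor open
SELECT @MemberID = MIN(MemberID) FROM PoolMembers

while @MemberID is not null
begin
    /* per-member counting and UPDATE work, as in the cursor version */

    SELECT @MemberID = MIN(MemberID) FROM PoolMembers WHERE MemberID > @MemberID
end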
Ugh! I just found out that my hosting company had given SQL Server a max of 152MB of RAM!
They’ve now bumped it up to 250MB.
Correction to my post above:
I AM doing a DB.close in my cleanup include file, so no connections should be left open.
Hi Larry,
152MB is not a lot ;-)
Don't you still need to make sure that record sets are closed off too?
Cheers
Twan
Hi there…
My recordsets are closed in the same clean-up script. What's the best way to connect to the SQL Server that will be on my IIS machine?
Use Query Analyzer or open an ISQL session.
_________
Satya SKJ
Moderator
SQL-Server-Performance.Com

Programmatically, OLE DB is better than ODBC; for .NET, use the SQL Server provider rather than OLE DB. Cheers
Twan
So my connection string would look like this?

"Provider=SQLOLEDB;Data Source=localhost;Initial Catalog=mydatabase;User Id=myuid;Password=mypassword"
yep that's right
Twan