Create 200-300 files/sec based on query | SQL Server Performance Forums

SQL Server Performance Forum – Threads Archive

Create 200-300 files/sec based on query

I have a somewhat intense web application that needs very rapid refreshes of data. Imagine a stock market where data changes constantly and I need to display data about each share in the market as close to real time as possible. Right now I fetch the data directly from the database on the fly, but as more and more users come to our website the strain on the database will be quite large. All users get the same information about each share, but because the data changes all the time I always have to get it from the database. Now what I was thinking was to create HTML files of the data instead, say every second (any less often would not be good enough). Is it at all possible to create/update 200-300 files every second, and how would I go about doing something like that? The great thing about this is that the strain on the database would be constant no matter how many users come to my website, creating a more stable and reliable site, and I could focus on other things like optimizing transactions. Is this a good idea, or are there other ways that are better…? —
Frettmaestro
"Real programmers don’t document, if it was hard to write it should be hard to understand"
Yes, you can do that by using XML with SQL Server; XML support is built into SQL Server 2000.
More information at the http://www.sqlxml.org link. HTH
Satya SKJ
Moderator
http://www.SQL-Server-Performance.Com/forum
This posting is provided “AS IS” with no rights for the sake of knowledge sharing.
Yes, I am aware that it is possible, but will it be possible to create the number of files I need at this interval? I have a feeling that doing this will be quite stressful on the server…but then again I haven’t tried yet so I really don’t know… —
Frettmaestro
"Real programmers don’t document, if it was hard to write it should be hard to understand"
I can’t compare the generation of files to the volume you’re looking for, but it is possible: we succeeded with a similar setup using Sybase release 11 a few years ago. You can’t be sure until you take up the challenge and try to produce the content. Satya SKJ
Moderator
http://www.SQL-Server-Performance.Com/forum
This posting is provided “AS IS” with no rights for the sake of knowledge sharing.
On the face of it, generating 200-300 files every second may be possible from a database generation point of view (although it adds a considerable load to the disk), BUT it will give you problems with people trying to read the files at the same time; you may get file-level locking issues. Just requesting the info out of the database would possibly be more robust, and quicker, than a file-based solution (I think).

If you are using .NET to develop this, then you can set up a cache in the web application which is refreshed whenever the SQL data changes. Clients would then get the updates whenever they ask for them.

If you wanted to push information out to the clients, then a normal browser-based solution may not be the way to go, as it goes against the web paradigm of the user requesting data. If you do need to do that, then you could consider:
– one client with multiple monitors (LAN based only)
– Queued components (controlled network not Internet)
– multi-cast push of the content (could extend to the Internet)
MS used to promote the phrase "Digital Dashboard", but I have no idea what happened to this…
Cheers
Twan
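The in-application cache suggested above is easy to sketch in any language. The following is a minimal Python illustration of the idea (the actual .NET `Cache` API works differently; this only demonstrates the mechanism): results are held for a short time-to-live, so every viewer inside that window shares one database hit.

```python
import time

class TtlCache:
    """Cache one query result per key for ttl seconds; all readers
    within the window share a single backend fetch."""
    def __init__(self, fetch, ttl=1.0):
        self.fetch = fetch           # function: key -> fresh data
        self.ttl = ttl
        self._store = {}             # key -> (expires_at, value)

    def get(self, key):
        now = time.monotonic()
        hit = self._store.get(key)
        if hit and hit[0] > now:
            return hit[1]            # still fresh: no database hit
        value = self.fetch(key)      # expired or missing: one fetch
        self._store[key] = (now + self.ttl, value)
        return value

# Demo with a call counter standing in for the database:
calls = {"n": 0}
def fake_query(market):
    calls["n"] += 1
    return f"{market}: snapshot {calls['n']}"

cache = TtlCache(fake_query, ttl=5.0)
for _ in range(400):                 # 400 page views inside the TTL window
    page = cache.get("ABC")
print(calls["n"])                    # 1 -> one database hit, not 400
```

A real deployment would also need a lock around the fetch so that concurrent misses don't stampede the database; that detail is omitted here.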
In the busiest market on our website (Internet) the data will change 10-20 times/sec, and we have about 200-300 different markets running at the same time. If 400 users are viewing one of the markets with an average refresh of about once every 5 seconds, we will have 80 read requests to the database each second for that market, on top of the 10-20 write transactions. This will of course not happen for a while, but I was thinking that by creating a file for each market you would only get one request per market instead of 80… I’m really not sure what would be the best way here; it’s only that if I could get the file-generation thing to work, the load on the server would be constant no matter how many users we have on the website, and that would be really comforting. —
Frettmaestro
"Real programmers don’t document, if it was hard to write it should be hard to understand"
I do not recommend the file approach. I think you will have disk issues that you will have to contend with, maybe not right away, but eventually. In this situation I would build a database poller that caches the data, updating it on some interval. Let the web site then connect to the poller and pull the data. If you are using .NET you have many options that make the database poller very simple and able to support high concurrency and performance.
"How do you expect to beat me when I am forever?"
So what you are saying is to create a table or something with the processed data that will be displayed on the website, and then just fetch the "raw" data from there…?? Hmm…now why didn’t I think of that… :)
—
Frettmaestro
"Real programmers don’t document, if it was hard to write it should be hard to understand"
Yes, it’s an alternative solution, but ensure the table does not become fragmented with all the data: clear it out once each transaction is completed. Above all, make sure to schedule regular DBCC checks to keep performance up. Satya SKJ
Moderator
http://www.SQL-Server-Performance.Com/forum
This posting is provided “AS IS” with no rights for the sake of knowledge sharing.