SQL Server Performance

Managing a huge database. Need help?

Discussion in 'ALL SQL SERVER QUESTIONS' started by sumith, Nov 28, 2012.

  1. sumith New Member


    We have a database 'testDb' with 5 tables,

    mainly 'user_details_tab' and 'user_deatils_backup_tab'.

    The database 'testdb' is filled with a large volume of data daily.

    Each day, the data in 'user_details_tab' is moved to 'user_deatils_backup_tab' and then 'user_details_tab' is cleared, roughly as sketched below.
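
    Simplified, the daily move looks something like this (the real table has more columns; 'user_id', 'details' and 'created_date' here are just placeholder names):

        -- Simplified sketch of the daily archive step (column names are placeholders)
        INSERT INTO user_deatils_backup_tab (user_id, details, created_date)
        SELECT user_id, details, created_date
        FROM user_details_tab;

        TRUNCATE TABLE user_details_tab;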

    The problems are:

    1. After 2 weeks or so the backup table 'user_deatils_backup_tab' contains millions of rows, and this affects querying of data in that table.

    2. Millions of rows are inserted into this database daily. In this situation I can't imagine the application running continuously for a year or more.

    Note: we also need to back up this database.

    Can anyone suggest a solution to these problems?
  2. FrankKalis Moderator

    Welcome to the forums!
    A database is supposed to store data, and most of the time it stores really large amounts of data. We have a table here with ~3 billion rows and ~50 million inserts per day. No problems at all; it is properly backed up and everything. "Millions of rows" after 2 weeks is nowhere near what I would call a large amount of data that should bring a system like SQL Server to its knees. However, it is certainly a different situation from databases that have a few 100k rows after a couple of years. It requires more careful planning upfront, and all the individual parts, like application code, database code, database design, hardware, network, etc., need to work together nicely to provide a satisfying user experience even after some time.

    If, however, you see a significant impact on performance now, this might point to one or more weaknesses in your setup:
    • It could be that the whole design is not geared towards that amount of data
    • Tables and/or indexes might not be designed to provide effective and fast access to the data
    • The database server might be underpowered
    • Slow disks / slow network, etc...
    • The application could consume the data in an inefficient way
    • A combination of all the above
    You don't say much about your system, but one thing to investigate in your situation might be partitioning. Another might be to define a data retention policy: for example, you keep user details for 1 year and each day delete data older than that on a rolling basis. There are certainly other approaches, but for those we would need to know more about your application.
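
    A minimal sketch of the rolling delete, assuming the backup table has a datetime column (called 'created_date' here, which is just an assumed name) and deleting in small batches so locking and log growth stay under control:

        -- Rolling retention: purge rows older than 1 year in small batches
        -- ('created_date' and the batch size are placeholders; adjust to your schema)
        DECLARE @cutoff datetime;
        SET @cutoff = DATEADD(YEAR, -1, GETDATE());

        WHILE 1 = 1
        BEGIN
            DELETE TOP (10000)
            FROM user_deatils_backup_tab
            WHERE created_date < @cutoff;

            IF @@ROWCOUNT = 0 BREAK;
        END

    If the table were partitioned by date, switching out or dropping whole partitions would be far cheaper than deleting rows, but that needs more setup and a suitable date column to partition on.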
  3. sumith New Member

    We are running with limited resources; we cannot give a dedicated server to this application (it is a web application).
    Please excuse me, I cannot give full details about the application.
    I have added indexes on the columns and re-normalised the tables. When querying data from the tables it takes more time than we expected.

    The application refreshes every 3 or 4 minutes, and this is the issue I'm facing:
    the query does not finish executing within each refresh interval.

    Please give suggestions on how:

    1. to keep all the data (backup/archive), and
    2. to make querying faster.

    THANK YOU
