SQL Server Performance

.sql file is too large

Discussion in 'SQL Server 2005 General Developer Questions' started by danijel cecelja, Feb 4, 2009.

  1. danijel cecelja New Member

    we are working on migrating data from one DB to another, as part of that I have made a tool that reads the data and generates an .sql file from it.
    It looks a bit like this:
    delete from dbo.BLWIJournalCodes where DATAAREAID='wmp'
    GO
    7,12614,'',1,'wmp',1,12615 UNION ALL SELECT 7,12616,'',1,'wmp',1,12617 UNION ALL SELECT 7,12619,'',1,'wmp',1,12620 UNION ALL SELECT 7,12621,'',1,'wmp',1,12622 UNION ALL SELECT 7,12624,'',1,'wmp',1,12625 UNION ALL SELECT ... and so on
    After about 200 UNION ALLs, I write a COMMIT GO BEGIN TRANSACTION to flush memory, and then the query goes on. The problem is that when you have 2 million records, the file gets so large you can't open it in the query editor anymore. But the file is there; it just needs to be executed!
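    The overall batch structure of the generated file looks roughly like this (a sketch; it assumes the rows are inserted back into dbo.BLWIJournalCodes, which the excerpt does not show explicitly):

    ```sql
    -- Clear out the old rows for this data area first.
    delete from dbo.BLWIJournalCodes where DATAAREAID='wmp'
    GO
    BEGIN TRANSACTION
    -- Assumed insert target; the column values come from the excerpt above.
    INSERT INTO dbo.BLWIJournalCodes
    SELECT 7,12614,'',1,'wmp',1,12615 UNION ALL
    SELECT 7,12616,'',1,'wmp',1,12617
    -- ...about 200 UNION ALL rows per batch...
    COMMIT
    GO
    BEGIN TRANSACTION
    -- ...next batch of ~200 rows, and so on...
    COMMIT
    GO
    ```

    (SQL Server 2005 has no multi-row VALUES clause, hence the UNION ALL SELECT pattern; multi-row VALUES arrived in SQL Server 2008.)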
    Maybe someone has an idea how to handle this problem?
  2. FrankKalis Moderator

    You don't need to open a file to get it executed. You can use the command line SQLCMD tool for that purpose.
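    For example, from a command prompt (the server, database, and file names here are placeholders; -E uses Windows authentication, -i names the input script, -o captures the output):

    ```
    sqlcmd -S myServer -d TargetDB -E -i migration.sql -o migration.log
    ```

    SQLCMD streams the file batch by batch (split on GO), so the file never has to fit in an editor.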
  3. walidha New Member

    Why don't you use SSIS (SQL2005, 2008) or DTS(SQL2000) instead of inserting the data this way?
    You could store the data into a csv file and then import it into your table, couldn't you?
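    If you go the CSV route, a single BULK INSERT can load the whole file in batches; a sketch, where the file path, terminators, and batch size are assumptions to adjust for your export:

    ```sql
    -- Load the exported CSV into the target table, committing every
    -- 10,000 rows so the transaction log is checkpointed as it goes.
    BULK INSERT dbo.BLWIJournalCodes
    FROM 'C:\export\BLWIJournalCodes.csv'
    WITH (
        FIELDTERMINATOR = ',',
        ROWTERMINATOR   = '\n',
        BATCHSIZE       = 10000,
        TABLOCK          -- enables minimal logging on an empty table
    );
    ```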
  4. danijel cecelja New Member

    Yes, we are trying it that way now. Thank you for the replies.
