SQL Server Performance Forum – Threads Archive
bulk transfer

I can see that SQL Server's bulk import/export speeds up bulk data transfer considerably, but it looks to me like this has to be done by creating an additional data file. Consider this scenario: I have a large bulk of data, and after some massaging (dimensional transformations, for example) I want to save it to a SQL Server 2000 table. Do I have to first dump it to a file and then import it again? That doesn't sound like it would be faster than plain JDBC batch processing, given all these "file" steps. I need this embedded in an application; one could write a small C++ app to access the table via ODBC. Is there a way to build a pipeline stream that writes the records directly to the table without using JDBC, but in a native way? If so, how? Your advice is greatly appreciated. xz
If you use .NET 2.0, you can use the SqlBulkCopy class to do this. Mladen has a good blog post about it here: http://weblogs.sqlteam.com/mladenp/archive/2006/08/26/11368.aspx If you pass SqlBulkCopy the return value of SqlCommand.ExecuteReader(), it does exactly what you are asking for.

SqlSpec – a fast, cheap, and comprehensive data dictionary generator
for SQL Server 2000 and 2005 and Analysis Server 2005 – www.elsasoft.org
Thanks for the quick response. I am going to check out that blog. However, I need this in a J2EE environment.
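In a J2EE environment without .NET's SqlBulkCopy, the usual fallback mentioned above is JDBC batching: stream the massaged rows straight from the application into the table via PreparedStatement.addBatch()/executeBatch(), with no intermediate file. The sketch below assumes a hypothetical table FactSales with columns dim_key and amount; the batch-planning helper is separated out so the flushing logic can be exercised without a live database connection.

```java
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.util.ArrayList;
import java.util.List;

public class BatchInsertSketch {
    static final int BATCH_SIZE = 1000;

    // Streams rows from memory into the table with JDBC batching -- no
    // intermediate file. Table and column names here are hypothetical;
    // "rows" stands in for the massaged in-memory data.
    static void insertRows(Connection con, List<Object[]> rows) throws Exception {
        con.setAutoCommit(false); // commit once, not per row
        try (PreparedStatement ps = con.prepareStatement(
                "INSERT INTO FactSales (dim_key, amount) VALUES (?, ?)")) {
            int pending = 0;
            for (Object[] r : rows) {
                ps.setObject(1, r[0]);
                ps.setObject(2, r[1]);
                ps.addBatch();               // queue the row client-side
                if (++pending == BATCH_SIZE) {
                    ps.executeBatch();       // flush a full batch to the server
                    pending = 0;
                }
            }
            if (pending > 0) ps.executeBatch(); // flush the final partial batch
        }
        con.commit();
    }

    // The part we can run standalone: how many executeBatch() calls a
    // given row count produces, and how large each flush is.
    static List<Integer> planBatches(int rowCount) {
        List<Integer> flushes = new ArrayList<>();
        for (int left = rowCount; left > 0; left -= BATCH_SIZE) {
            flushes.add(Math.min(left, BATCH_SIZE));
        }
        return flushes;
    }

    public static void main(String[] args) {
        System.out.println(planBatches(2500)); // [1000, 1000, 500]
    }
}
```

This will not match BULK INSERT speeds, but keeping auto-commit off and flushing in sizable batches avoids the per-row round trips that make row-at-a-time JDBC slow. (As a side note, much later versions of Microsoft's JDBC driver added a SQLServerBulkCopy class analogous to .NET's, but that was not available for the SQL Server 2000-era drivers discussed here.)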