
SQL Server Performance Forum – Threads Archive

Bulk insert command

I’m learning about this command now, and as an experiment I created a local database on my machine. I then recreated a table from one of our development servers and exported 100,000 rows from that table to a text file. The file is comma delimited, the first row has column headers, and text fields are surrounded with double quotes. Then I tried executing the following command:

alter database testactivity set recovery bulk_logged
go
bulk insert testactivity.dbo.activity
from 'C:\Temp\Activity.txt'
with (tablock, firstrow=2, fieldterminator=',', rowterminator='/n')
go
alter database testactivity set recovery full
go

I get the following error from this:

Msg 4866, Level 16, State 8, Line 2
The bulk load failed. The column is too long in the data file for row 1, column 38. Verify that the field terminator and row terminator are specified correctly.
Msg 7301, Level 16, State 2, Line 2
Cannot obtain the required interface ("IID_IColumnsInfo") from OLE DB provider "BULK" for linked server "(null)".

I’m not sure how the column data could be too long. The table is an exact duplicate of the schema this data came from. Perhaps I’m missing an option in my command? Rob Mills
It could be a data issue…
If any of the column data contains a comma, SQL treats it as a separate column…
Open the text file and see if there is any comma in your data… Mohammed U.
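For example, a row like this (hypothetical data, not from the actual file) would throw off the field count, because BULK INSERT treats the comma inside the quoted name as a field terminator:

101,"Smith, John",2007-03-15

BULK INSERT does not recognize double quotes as text qualifiers by default, so "Smith" and " John" land in separate columns and every field after them shifts one position to the right.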
quote: Originally posted by MohammedU
It could be a data issue…
If any of the column data contains a comma, SQL treats it as a separate column…
Open the text file and see if there is any comma in your data… Mohammed U.

Thanks. I checked on that, and no columns actually contain a comma. Rob Mills
Check your rowterminator… don’t you think it should be '\n' instead of '/n'? Check the BOL examples once again…
Mohammed U.
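For reference, here is a sketch of the corrected command (assuming the same table and file as above; if the file uses Windows line endings, '\r\n' can be specified explicitly, though BULK INSERT normally accepts '\n' for those as well):

-- row terminator is backslash-n, not forward-slash-n
bulk insert testactivity.dbo.activity
from 'C:\Temp\Activity.txt'
with (tablock, firstrow = 2, fieldterminator = ',', rowterminator = '\n')

Note that this still loads the surrounding double quotes as part of each text field, which can also contribute to the "column is too long" error; stripping them requires a format file that includes the quote characters in the field terminators.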