SQL Server Performance Forum – Threads Archive

Import .xls file

Hi, I import an Excel file daily through an automated process, but yesterday I got an error saying that the destination column does not allow nulls. When I looked in detail, everything looked fine to me. The only difference I can see from previous loads is that this file is bigger than the files processed before. Please help me load it anyway.
Thanks!
"He laughs best who laughs last"
I don't think the bigger file is the problem.
Check the Excel file again for null values.
Luis Martin
Moderator
SQL-Server-Performance.com All in Love is Fair
Stevie Wonder
All postings are provided “AS IS” with no warranties for accuracy.
I am sure I don't have any duplicates or nulls. The weird thing is that when I split the file into two and ran my job, it worked fine, so I concluded that the error was due to the large number of rows. If the number of rows exceeds 35, the job fails. Please help me find the reason why this happens. Thanks!
"He laughs best who laughs last"
Look at the SOURCE data … At least one of the rows has a blank in at least one of the columns, or an incorrect format for the intended data. This is Excel, so we're not talking about proper columns in a table: the information in a cell can be inconsistent with the general type of the data in that column. You could have a date like 31-Feb without a year, or a number like 23e8, and they would come over as NULL. The column where this data is inserted does not allow NULLs; that is what the error message is telling you. Don't bother with the number of rows, unless you have an import routine with a row counter declared as a data type that is too small, like TINYINT, which is limited to 255.
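To see which rows the Jet provider fails to convert, you can query the sheet directly from SQL Server. This is a minimal sketch: the provider string matches what DTS typically uses for .xls files, but the file path, sheet name, and column name are placeholders you would replace with your own.

    SELECT *
    FROM OPENROWSET('Microsoft.Jet.OLEDB.4.0',
        'Excel 8.0;Database=C:\Data\import.xls;HDR=YES',
        'SELECT * FROM [Sheet1$]')
    WHERE SomeColumn IS NULL  -- cells Jet could not convert arrive as NULL

Adding IMEX=1 to the connection string ('Excel 8.0;Database=...;HDR=YES;IMEX=1') tells the driver to treat mixed-type columns as text, which often avoids these silent NULLs.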
If there is an error in the data, how does the file load successfully when I split the same file into two files and load them separately? Thanks!
"He laughs best who laughs last"
After you split, you have one batch of 35 rows, and another batch of … rows?
I split the file into 20 and 20 rows. Thanks!
"He laughs best who laughs last"
You mention an automated process that processes the Excel sheet. What kind of process – DTS, BCP, INSERT based on OPENQUERY, … ?
DTS. Thanks!
"He laughs best who laughs last"
Okay, then can you check the exact step where the data is inserted into the target table? Is there a value being inserted that is not coming from the Excel sheet? Also, check for INSERT triggers on the target table. Perhaps there is a trigger that was not designed for batches of more than 35 rows.
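As a quick way to double-check, you can list the triggers on the table from Query Analyzer. A small sketch, where 'YourTargetTable' stands in for the real table name:

    -- documented helper procedure
    EXEC sp_helptrigger 'YourTargetTable'

    -- or query the system catalog directly (SQL Server 2000 and earlier)
    SELECT name
    FROM sysobjects
    WHERE type = 'TR'
      AND parent_obj = OBJECT_ID('YourTargetTable')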
I don't have any triggers.
Upon googling, I found claims that this type of import cannot handle a large number of rows and can only process a limited number. I'm not sure whether that is true. Thanks!
"He laughs best who laughs last"
IIRC, Excel sheets are limited to 65K rows or thereabouts. Where did you find that limitation? Excel has a wizard that you can use to help set up a data import, and it will guess the appropriate data types based on the first 35 rows in the source. There is a registry setting that you can change to increase the number of rows used for determining the data types. But that's a whole other issue …
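For reference, the setting in question is the TypeGuessRows value under the Jet engine's Excel key in the registry; setting it to 0 makes Jet scan the whole sheet (up to 16,384 rows) before guessing each column's data type. The sketch below assumes the Jet 4.0 key path and uses the undocumented xp_regread/xp_regwrite extended procedures purely as an illustration; editing the value in regedit achieves the same thing.

    -- read the current value
    EXEC master..xp_regread
        @rootkey    = 'HKEY_LOCAL_MACHINE',
        @key        = 'SOFTWARE\Microsoft\Jet\4.0\Engines\Excel',
        @value_name = 'TypeGuessRows'

    -- 0 = scan the entire sheet before deciding on a column's type
    EXEC master..xp_regwrite
        @rootkey    = 'HKEY_LOCAL_MACHINE',
        @key        = 'SOFTWARE\Microsoft\Jet\4.0\Engines\Excel',
        @value_name = 'TypeGuessRows',
        @type       = 'REG_DWORD',
        @value      = 0

The importing process may need to be restarted before the new value takes effect.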