SQL Server Performance Forum – Threads Archive

Exception Files

I have created a DTS package in SQL Server that brings data in from a flat file. In the data pump I have created an exception path to log all of the data that does not come through because of bad data, such as incorrect data types. When I run the package and rows are kicked out because of bad data, they are successfully logged in the exception file, and another text file is created to show which rows were rejected. The text files point to a UNC path, not to my local computer. My problem is that when I set this package up as a job, the log file shows that there are no errors, and I am obviously not getting the text file that shows the rejected rows. I am positive it is not bringing in all of the data. Can you please help? Thank you.

Try running PROFILER during this operation in order to see the activity. Satya SKJ
Moderator
http://www.SQL-Server-Performance.Com/forum
This posting is provided “AS IS” with no rights for the sake of knowledge sharing.
quote: Originally posted by satya: Try running PROFILER during this operation in order to see the activity.

I ran a trace while running the package as a job and again while running it manually, but I can't identify where the exception is made. What should I be looking for? Is there another way to identify whether rows did not transfer because of bad data? We need to somehow identify those rows, change the data, and bring them in. Any help would be greatly appreciated. Thank you!
Run the query in Query Analyzer (QA) to get the rows into the text files as defined in the DTS package. Satya SKJ
Moderator
http://www.SQL-Server-Performance.Com/forum
This posting is provided “AS IS” with no rights for the sake of knowledge sharing.
Sorry, I am a bit confused: should I run a query based on the rows in the text file through QA? I need to be able to identify which rows were rejected so I can fix the data and run them through again.
Without running the query against the table, how will you get the troubled data? Satya SKJ
Moderator
http://www.SQL-Server-Performance.Com/forum
This posting is provided “AS IS” with no rights for the sake of knowledge sharing.
This is the scenario: I have a SQL task to delete a table, a txt source file, and a SQL connection. The data pump between the txt source file and the SQL connection has an ActiveX transformation which transforms some of the txt data types into datetime or money data types. I need ALL of the data from the txt file to go into the defined SQL table. I have named a path in the options tab of the Transform Data Task properties to create an exception file if any rows are rejected by the SQL table.

When I run the package manually I get two files. One is a txt file that states whether there were any rows that did not transfer. This is the file I get:

Execution Started: 06/21/2004 8:16:59 AM

@@LogSourceRows: \Cba04clientrpSQLExceptionFilesC_GRException.txt.Source
@@LogDestRows: \Cba04clientrpSQLExceptionFilesC_GRException.txt.Dest
@@ErrorRow: 327
Error during Transformation 'DTSTransformation__1' for Row number 327. Errors encountered so far in this task: 1.
Error Source: Microsoft Data Transformation Services (DTS) Data Pump
Error Description: ActiveX Scripting Transform 'DTSTransformation__1' encountered an invalid data value for 'GRTODATE' destination column.
Error Help File: sqldts80.hlp
Error Help Context ID: 30632
@@SourceRow: Logging Failed
Error code: 8007000E
@@DestRow: Not Available
@@ExecutionCompleted
Execution Completed: 06/21/2004 8:17:01 AM

I also get a txt.Source file which contains the rows that were rejected. When I run this same package as a job I do not get the txt.Source file, and I get the following in the txt file, which does not show that there were any errors:

Execution Started: 6/21/2004 10:25:26 AM
@@LogSourceRows: \Cba04clientrpSQLExceptionFilesC_GRException.txt.Source
@@LogDestRows: \Cba04clientrpSQLExceptionFilesC_GRException.txt.Dest
@@ExecutionCompleted
Execution Completed: 6/21/2004 10:25:30 AM

I just can't figure out why it should be different. The package was created on the server, not on my local machine, and the exception file points to a UNC file name that is also on the server.
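For reference, the kind of conversion that ActiveX transform performs per row can be mimicked in T-SQL; this is only an illustrative sketch, where the staging table and the Amount column are assumptions, and GRTODATE is taken from the exception log above:

    -- Roughly what the transform attempts for each row; a single impossible
    -- calendar date makes the conversion fail, which is the class of error
    -- that diverts that row to the exception file on a manual run.
    SELECT CONVERT(datetime, GRTODATE) AS GRTODATE,
           CONVERT(money, Amount)      AS Amount
    FROM   GR_Staging

The difference is that a set-based CONVERT fails the whole statement, whereas the DTS data pump evaluates row by row and can divert just the offending row.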
Error 8007000E relates to some sort of memory issue while running this DTS package.
What is the memory setting on SQL? Ensure you have the latest service pack on SQL & OS. Satya SKJ
Moderator
http://www.SQL-Server-Performance.Com/forum
This posting is provided “AS IS” with no rights for the sake of knowledge sharing.
I actually want the error. I know what the error is: it is rejecting one row that has an invalid date value. That is my problem. I want it to show the same error when I run it as a job, but it is not showing up as rejecting any rows, yet when I go into the table I can see that it has rejected that one row with the bad date. This is a big problem when we are running our large data transfers and there are hundreds of rows being rejected, but I am not getting any exception files letting me know which rows are being rejected and why. We will need to fix those rejected records and transfer them back in, but if we don't know which records are being rejected we cannot do that.
What is the rowcount on the table?
Can you list the data on the troubled table?
Also confirm above points. Satya SKJ
Moderator
http://www.SQL-Server-Performance.Com/forum
This posting is provided “AS IS” with no rights for the sake of knowledge sharing.
The source file has 825 records; the destination table has 824. This clearly shows that one row was rejected, but the exception file created when running this as a job shows that there were no errors.
Any ideas?
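As a rough way to automate that rowcount check, the flat file could be bulk-loaded into a scratch table and compared against the destination. GR_Staging, GR_Destination, and the file path below are all assumed names, not ones taken from the package:

    -- Hypothetical sketch: stage the raw file (GR_Staging is assumed to already
    -- exist with matching varchar columns), then compare counts.
    BULK INSERT GR_Staging
    FROM '\\server\share\sourcefile.txt'
    WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n')

    SELECT (SELECT COUNT(*) FROM GR_Staging)     AS SourceRows,
           (SELECT COUNT(*) FROM GR_Destination) AS DestRows

Any difference between the two counts means at least one row was rejected, even if the job's exception file is silent about it.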
Can you cross-check the data from the flat file against the rows imported, to ensure there are no issues with the import? Satya SKJ
Moderator
http://www.SQL-Server-Performance.Com/forum
This posting is provided “AS IS” with no rights for the sake of knowledge sharing.
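One way to do the cross-check Satya describes, assuming the flat file has been staged into a table and both tables share a key column; GR_Staging, GR_Destination, and GRKEY are all hypothetical names:

    -- Anti-join: rows present in the staged source but missing from the destination.
    SELECT s.*
    FROM   GR_Staging AS s
           LEFT JOIN GR_Destination AS d
                  ON d.GRKEY = s.GRKEY
    WHERE  d.GRKEY IS NULL

The rows this returns are the ones the data pump rejected, which is the same information the exception file is supposed to capture.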
I don't think I am explaining this clearly enough. There is one row that is not transferring. I have cross-checked the destination table with the source file, and all rows transferred except for one, because one of the fields is going into a column with a date data type and the value in the field is 02/30/04, which is an invalid date. That is fine. The problem I am having is that I need an exception file that lets me know that one of the rows was rejected, so I can go in, fix the error, and import it again. When I run the DTS package manually I get the exception file stating that one row failed to transfer, but I would like to run this as a nightly job. When I set up the DTS package as a job, the exception file does not state that there was an error when there really was one: that same row with the bad date did not transfer.
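Since the failures here are bad dates, one possible stopgap while the job's exception file misbehaves is to stage the raw file into varchar columns and screen it with ISDATE before the transform runs. GR_Staging is an assumed table name; GRTODATE and the 02/30/04 value come from earlier in the thread:

    -- Rows whose GRTODATE cannot be converted to datetime, e.g. 02/30/04.
    SELECT *
    FROM   GR_Staging
    WHERE  ISDATE(GRTODATE) = 0

That list can be handed back for correction and re-imported, independently of whether the job writes its exception file.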
Hi, I have noticed a very similar problem. Do you have an 'On Transform Failure' phase in your ActiveX script? If so, get rid of it and see if the <file name>.log.source file appears. I am pulling my hair out at the moment as I have a similar problem. I, like yourself, am using the Transform Task exception file to capture rows that can't be imported by the Transformation Task. For each row that is written to the exception file, I save a Row Identifier along with an Error string to an ADO recordset. Once the Transformation Task completes I check my recordset to see if I have errors; if so, I then have another Transformation Task that grabs the exception file and imports each row, along with the Error string associated with the row (from my recordset), into a generic table where my users can update the "bad" data.

When I run this package from a DIFFERENT MACHINE to where the actual DTS package (and SQL Server) resides, it works fine. As soon as I try to run the package from the MACHINE where the DTS package / SQL Server resides... BOOM... big bummer... my <file name>.log.source file doesn't get created. It all falls over.

If I remove the 'On Transform Failure' script my exception file always gets created, but obviously this doesn't really help me as I need to capture the error.

If anyone can shed some light on this it would be very much appreciated.

Cheers,

Justin (nearly pulled all my hair out now) Cale.
Orchid Business Systems.

Time flies like the wind...
Fruit flies like bananas
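For what it's worth, the generic holding table Justin describes might look something like this; every name here is illustrative only:

    -- One possible shape for a table that users can edit to fix rejected rows.
    CREATE TABLE ImportErrors (
        ErrorID       int IDENTITY(1,1) PRIMARY KEY,
        RowIdentifier varchar(50),
        ErrorString   varchar(500),
        RawRow        varchar(4000),
        LoggedAt      datetime DEFAULT GETDATE()
    )

Each rejected row from the exception file goes in as raw text alongside the error string captured in the ADO recordset, so the bad data can be corrected and re-run.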
Hey Justin,
I am feeling the same way; I am ready to pull my hair out. I have been working on this problem for about a month now and have gotten nowhere. I tried checking off the 'On transform failure' checkbox, but then the entire package failed. The package is actually creating the exception file, but it is not reflecting the error. Thanks for your insight though!

If you find a solution let me know!