SQL Server Performance

General Error in DTS Transform Data Task

Discussion in 'SQL Server DTS-Related Questions' started by cncncn, Sep 23, 2007.

  1. cncncn New Member

    I'm trying to load data for a table from DB1 to DB2 using the Transform Data task in DTS.
    Number of rows to be exported: 8 million.
    The database recovery model is simple.
    I keep getting this error:
    Error string: General error -2147221481 (80040017)
    after it inserts 4.5 million records. I'm attaching the log file and the exception file.
    Exception file:
    DATA TRANSFORMATION SERVICES: Data Pump Exception Log

    Package Description: (null)
    Package ID: {48AE0A59-DCC8-469C-9F7A-D457FD8BCCFC}
    Package Version: {18413A5D-84DD-46EA-AC69-2C6492CDE960}
    Step Name: LOAD_LogEventData
    Execution Started: 9/22/2007 10:49:44 AM
    Execution Completed: 9/22/2007 2:21:26 PM
    LOG FILE:
    DTSRun OnError: LOAD_LogEventData,
    Error = -2147221481 (80040017)
    Error string: General error -2147221481 (80040017).
    Error source: Microsoft Data Transformation Services (DTS) Package
    Help file: sqldts80.hlp Help context: 4700
    Error Detail Records:Error: -2147221481 (80040017); Provider Error: 0 (0)
    Error string: Error source: Microsoft Data Transformation Services (DTS) Package
    Help file: sqldts80.hlp Help context: 4700
    DTSRun OnFinish: LOAD_LogEventDataError: -2147220440 (80040428);
    Provider Error: 0 (0)
    Error string: Package failed because Step 'LOAD_LogEventData' failed.
    Error source: Microsoft Data Transformation Services (DTS) Package
    Help file: sqldts80.hlp Help context: 700
    DTSRun: Loading...
    DTSRun: Executing...
    ProgressCount = 554654000

    I used different batch sizes (1 / 1,000 / 10,000) with no luck.
    I'm using the TABLOCK option with the fast load option.
  2. ndinakar Member

    I can't interpret the error message, but if you used BCP you'd be done a lot faster. DTS is painfully slow. You can do an 8 million row transfer in under an hour, or even under 30 minutes, depending on your hardware and network bandwidth if you have to copy the bcp files across servers. DTS will easily take a nice couple of hours.
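    For reference, a bcp round trip for a table like this might look as follows. This is a sketch: the server, table, and file names are placeholders, and -T assumes Windows authentication.

    ```
    bcp DB1.dbo.LogEventData out C:\export\LogEventData.dat -n -S MYSERVER -T
    bcp DB2.dbo.LogEventData in C:\export\LogEventData.dat -n -S MYSERVER -T -b 10000 -h "TABLOCK"
    ```

    -n exports in native format, and -b commits every 10,000 rows so a late failure doesn't roll back the whole load.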
  3. satya Moderator

    It is better if you can enable PACKAGE LOGGING for this DTS package to see where it is failing and why.
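    If the designer's logging options don't surface anything, you can also capture a log from the command line with dtsrun. A sketch, assuming Windows authentication and hypothetical server and package names:

    ```
    dtsrun /S MYSERVER /E /N "LoadLogEventData" /L "C:\dts\LoadLogEventData.log"
    ```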
  4. cncncn New Member

    Package logging didn't help.
  5. satya Moderator

    How is the transaction log responding to this process?
    Have you monitored the growth of the Tlog files too? Also, what recovery model is adopted for this task?
  6. cncncn New Member

    Tran log growth is fine since I'm using the simple recovery model.
    CPU utilization and memory availability all look good.
    I ran into different kind of errors:
    1)Error during Transformation 'DTSTransformation__5' for Row number 8343973. Errors encountered so far in this task: 1. 275142909|2665||| ô|zn |||||
    This error says this specific row had a transformation issue, but that's not true: I was able to copy this record from the source DB to the destination DB.
    2)Error Description:TransformCopy 'DTSTransformation__5' conversion error: General conversion failure on column pair 1 (source column 'svValue' (DBTYPE_VARIANT), destination column 'svValue' (DBTYPE_VARIANT)).
    Error Help File:
    Error Help Context ID:30501
    vȱ|% |ÿÿ|| w Yã@|ÿÿ|ÿÿ| |ms|
    Same deal here. A rerun went through fine.
    Also, from what I learned browsing various forums, BULK INSERT is a better option for copying 800 million rows between databases, since there is no memory-to-memory copy involved as with BCP. But both the source and destination DBs are on the same server, so I thought it
    didn't make sense to use BCP.
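    Since both databases are on the same instance, the copy can also be done entirely in T-SQL, without DTS or bcp files. A minimal sketch, assuming the table has an increasing key column to drive the batches (LogEventID and the column list here are hypothetical names):

    ```sql
    -- Batched copy between two databases on the same instance.
    -- LogEventID is an assumed key column; replace with the real one.
    DECLARE @rows INT
    SET @rows = 1
    WHILE @rows > 0
    BEGIN
        INSERT INTO DB2.dbo.LogEventData WITH (TABLOCK)
            (LogEventID, svValue)   -- illustrative column list
        SELECT TOP 10000 s.LogEventID, s.svValue
        FROM DB1.dbo.LogEventData s
        WHERE NOT EXISTS (SELECT 1 FROM DB2.dbo.LogEventData d
                          WHERE d.LogEventID = s.LogEventID)
        SET @rows = @@ROWCOUNT
    END
    ```

    With the simple recovery model, committing every 10,000 rows keeps the transaction log from growing across the whole 8 million rows.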


  7. satya Moderator

    By default DTS uses the bulk insert process, but for an activity like this BCP is a good choice because you can control the process with code; the same applies to BULK INSERT.
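    For example, a BULK INSERT load of a native-format file exported with bcp might look like this sketch (the file path and DATAFILETYPE are assumptions about how the file was produced):

    ```sql
    -- Load a native-mode bcp export in 10,000-row batches.
    BULK INSERT DB2.dbo.LogEventData
    FROM 'C:\export\LogEventData.dat'
    WITH (DATAFILETYPE = 'native', BATCHSIZE = 10000, TABLOCK)
    ```

    BATCHSIZE commits each batch as its own transaction, so a failure mid-load only rolls back the current batch.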
