Hi all,

I'm working on an SSIS package solution to move data from hundreds of tables in an operational database into an archive database. The two databases have exactly the same structure in terms of tables etc. Each table in the operational database has a flag column which I use to expire data when new data is loaded, and I have a control table that lists the tables in the database along with an archive flag indicating whether or not each table needs archiving.

My SSIS package currently iterates through all the tables that need archiving in a Foreach Loop container. For each table it:

1. Checks whether the archive database already has data for the date we are trying to archive. If so, it deletes that data using an "Execute SQL task" running a "DELETE FROM ... WHERE ..." query that I build in a script component.
2. Copies the data across to the archive database using an "Execute SQL task" running an "INSERT INTO ... SELECT FROM" query, also built in a script component.
3. Does a row count check to make sure all the data has been transferred, then deletes the data from the operational database.
4. Writes an entry to a logging table stating how many rows were archived and for which dates.

The deletes are taking a long time and I need to speed the whole process up. One thing I'm about to try is doing the deletes in batches to avoid filling the log (rough sketch below). I've been doing some reading on improving SSIS performance and it seems that many optimisations only apply when data flow components are used, but my solution doesn't use any.

I'm looking for any suggestions I can get to improve the performance. I don't mind starting from scratch if SSIS is not the right way to go.

Thanks in advance.
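
Edit: for reference, this is roughly the batched delete pattern I'm planning to try. The table and column names (dbo.OperationalTable, ArchiveDate) and the batch size are just placeholders, since the real statement is built per table in the script component.

DECLARE @BatchSize int;
DECLARE @RowsDeleted int;
DECLARE @DateToArchive datetime;

SET @BatchSize = 10000;          -- tune per table; smaller batches mean smaller transactions
SET @RowsDeleted = 1;
SET @DateToArchive = '20240101'; -- placeholder; would come from the loop's date variable

WHILE @RowsDeleted > 0
BEGIN
    -- Delete a limited number of rows per statement so each batch is its own
    -- transaction, instead of one huge delete that bloats the log.
    DELETE TOP (@BatchSize)
    FROM dbo.OperationalTable
    WHERE ArchiveDate = @DateToArchive;

    SET @RowsDeleted = @@ROWCOUNT;
END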