
I have 100 pre-written SQL Server insert statements that pull data from a Progress database via openquery into 100 SQL Server tables, e.g.:

insert into table1 (col1, col2) select X, Y from aprogresstable
insert into table2 (col1, col2, col3) select X, Y, Z from anotherprogresstable

...and so on for 100 tables.

However, this is slow over openquery. Therefore, I'd like to import this data nightly from Progress into the existing SQL Server tables using SSIS.

What is the best way to achieve this without needing to create 100 data flow components (e.g. 100 sources and 100 destinations)?

The SQL Server Import/Export wizard is an option here, but I've found it has data-mapping issues with date fields.

Therefore, is it possible to create a For Each Loop in SSIS that loops through 100 SQL insert statements, pulling data from Progress and pushing it to the existing SQL Server tables?

Thanks!

user27768

1 Answer


If I were dealing with your particular scenario, I would probably end up building a "mapping" table with [SourceTable] and [DestinationTable] columns.

Then I would use that table to feed parameters into a cursor that selects from each [SourceTable] and pushes the records to the corresponding [DestinationTable].

Of course, this only works if the column lists line up between each source/destination pair (or if you do "INSERT INTO [DestinationTable] SELECT * FROM [SourceTable]", which I wouldn't advise).
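Here's a minimal sketch of that idea. The PROGRESS linked server name and the dbo.TableMapping table (including its columns) are hypothetical; I'm also assuming the mapping table stores each pair's source and destination column lists, so the dynamic SQL can name columns explicitly instead of relying on SELECT *:

-- Hypothetical mapping table: one row per source/destination pair.
CREATE TABLE dbo.TableMapping
(
    SourceTable      sysname       NOT NULL,
    DestinationTable sysname       NOT NULL,
    SourceColumnList nvarchar(max) NOT NULL,  -- e.g. 'X, Y'
    DestColumnList   nvarchar(max) NOT NULL   -- e.g. 'col1, col2'
);

DECLARE @Source      sysname,
        @Destination sysname,
        @SrcCols     nvarchar(max),
        @DestCols    nvarchar(max),
        @Sql         nvarchar(max);

DECLARE MapCursor CURSOR LOCAL FAST_FORWARD FOR
    SELECT SourceTable, DestinationTable, SourceColumnList, DestColumnList
    FROM dbo.TableMapping;

OPEN MapCursor;
FETCH NEXT FROM MapCursor INTO @Source, @Destination, @SrcCols, @DestCols;

WHILE @@FETCH_STATUS = 0
BEGIN
    -- Build one INSERT ... SELECT per mapping row, pulling the data
    -- through OPENQUERY against the (assumed) PROGRESS linked server.
    SET @Sql = N'INSERT INTO ' + QUOTENAME(@Destination) + N' (' + @DestCols + N')'
             + N' SELECT ' + @SrcCols
             + N' FROM OPENQUERY(PROGRESS, ''SELECT ' + @SrcCols
             + N' FROM ' + @Source + N''')';
    EXEC sys.sp_executesql @Sql;

    FETCH NEXT FROM MapCursor INTO @Source, @Destination, @SrcCols, @DestCols;
END;

CLOSE MapCursor;
DEALLOCATE MapCursor;

Storing the column lists per row sidesteps the SELECT * problem above, since every generated statement names its columns explicitly, and adding or changing a table becomes a data change in dbo.TableMapping rather than a package change.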

Pros to 100 components: modular; changes can be isolated if only one or a few of them need to change.

Cons to 100 components: 100 components to maintain and change, which is painful especially when a small change applies to all of them.

SQLDevDBA