
Hi, I'm currently working on a project that involves a third-party database and application. So far we are able to successfully test and interface data between our databases. However, we are having trouble when extracting a large set of data (e.g. 100,000 rows with 10 columns per row): if the transfer suddenly stops in the middle of the transaction for whatever reason (e.g. a blackout, a forced exit, etc.), we end up with missing or duplicated data.

Can you please give us suggestions on how to handle these types of scenarios?

Here's our current interface structure:

  • OurDB -> InterfaceDB -> 3rdPartyDB
  • OurDB: We extract records from OurDB (those with bitcolumn set to false) into the InterfaceDB
  • InterfaceDB: After inserting the records from OurDB, we update bitcolumn to true in OurDB (see the sketch after this list)
  • 3rdPartyDB: They extract and then delete all records from the InterfaceDB (they assume that all records are ready for extraction)
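For illustration, a minimal sketch of those two steps, assuming purely as placeholders a source table OurDB.dbo.SourceTable with a bit flag bitcolumn and a staging table InterfaceDB.dbo.InterfaceTable reachable from the same connection (not the actual schema):

-- step 1: copy the not-yet-extracted rows into the interface database
insert into InterfaceDB.dbo.InterfaceTable (c1, c2)
select c1, c2
from OurDB.dbo.SourceTable
where bitcolumn = 0

-- step 2: flag those rows as extracted in the source
-- (an interruption between step 1 and step 2 is exactly where rows end up
--  copied but not flagged, i.e. duplicated on the next run)
update OurDB.dbo.SourceTable
set bitcolumn = 1
where bitcolumn = 0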

1 Answer


Did you consider running in a loop and extracting TOP 10000 (or 5000) rows each time?

This is possible only if you can identify the rows later for the delete / bitcolumn update. For example, if you have an identity column ID, it could look like this:

declare @id int
set @id = (select min(ID) from yourtable where bitcolumn = 0)

while exists (select top 1 1 from yourtable where bitcolumn = 0)
begin
    -- extract the next batch of rows that have not been transferred yet
    select c1, c2
    from yourtable
    where bitcolumn = 0
        and ID between @id and @id + 5000

    [your code here]

    -- flag the same ID range as transferred
    update yourtable
    set bitcolumn = 1
    where bitcolumn = 0
        and ID between @id and @id + 5000

    set @id = @id + 5001
end
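A possible refinement, sketched with the same placeholder names as above, is to wrap each batch in its own transaction so that the copy into the InterfaceDB and the bitcolumn update either both commit or both roll back; after a blackout or forced exit, at most the current batch has to be rerun and nothing is left half-transferred:

begin transaction

    -- copy the current batch into the interface database
    insert into InterfaceDB.dbo.InterfaceTable (c1, c2)
    select c1, c2
    from yourtable
    where bitcolumn = 0
        and ID between @id and @id + 5000

    -- flag the same batch as extracted
    update yourtable
    set bitcolumn = 1
    where bitcolumn = 0
        and ID between @id and @id + 5000

commit transaction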
Liya Tansky