Hi, I'm currently working on a project that involves a third-party database and application. So far we are able to successfully test and interface data between our databases. However, we are having trouble when extracting a large data set (e.g. 100,000 rows with 10 columns each): if the process stops in the middle of the transaction for whatever reason (e.g. a blackout, a forced exit, etc.), we end up with missing or duplicated data.
Can you please give us suggestions for handling these scenarios?
Here's our current interface structure:
- OurDB -> Interface DB -> 3rdParty DB
- OurDB: We extract records from OurDB (those with the bit column set to false) into InterfaceDB.
- InterfaceDB: After inserting the records into InterfaceDB, we update the bit column in OurDB to true.
- 3rdPartyDB: They extract and then delete all records from InterfaceDB (they assume every record present is ready for extraction).
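For reference, the first two steps above can be sketched as a single atomic transaction, which is one way to avoid the missing/duplicated rows on interruption. This is only an illustrative sketch using SQLite and hypothetical table/column names (`our_records`, `interface_records`, `is_extracted`), not our actual schema:

```python
import sqlite3

# Hypothetical schema standing in for OurDB and InterfaceDB.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE our_records ("
    "id INTEGER PRIMARY KEY, payload TEXT, is_extracted INTEGER DEFAULT 0)"
)
conn.execute("CREATE TABLE interface_records (id INTEGER PRIMARY KEY, payload TEXT)")
conn.executemany(
    "INSERT INTO our_records (payload) VALUES (?)",
    [("row-%d" % i,) for i in range(5)],
)
conn.commit()

# Copy the un-extracted rows AND flip their bit flag inside one
# transaction: if the process dies mid-way, both statements roll
# back together, so no row is left copied-but-unflagged (duplicate
# on retry) or flagged-but-uncopied (lost).
with conn:  # commits on success, rolls back on any exception
    conn.execute(
        "INSERT INTO interface_records (id, payload) "
        "SELECT id, payload FROM our_records WHERE is_extracted = 0"
    )
    conn.execute("UPDATE our_records SET is_extracted = 1 WHERE is_extracted = 0")
```

In a real setup the two databases live on different servers, so a plain local transaction is not enough on its own; the sketch only shows the shape of the invariant (copy and flag must succeed or fail together).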