I manage a few tables with more than 1 billion records each. Their ETLs run daily, reading data from flat files and inserting a few thousand records into each of them.
Now I also need to keep these same tables in another DB, and of course they need to contain the same data. The data will be transferred over a Linked Server.
The easiest way would be to truncate the destination tables, select from the source and insert into the destination, and then rebuild the indexes. But I don't like the idea of copying the same data over again every day.
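Roughly, I mean something like this (table, database and linked server names are just placeholders):

```
-- Full reload sketch: [SRC] is the linked server, SourceDb the source
-- database, dbo.BigFact the table (same columns on both sides).
TRUNCATE TABLE dbo.BigFact;

-- Pull everything across the linked server in one pass.
INSERT INTO dbo.BigFact WITH (TABLOCK)
SELECT *
FROM [SRC].SourceDb.dbo.BigFact;

-- Rebuild all indexes on the destination after the bulk load.
ALTER INDEX ALL ON dbo.BigFact REBUILD;
```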
Another solution I see is to query both the source and destination tables for their distinct values of the date dimension, and then copy only the records for dates that are missing on the destination. I'd then create a monitor that checks whether the number of records per day matches on both sides and highlights the days that don't.
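Something along these lines, assuming both tables have a LoadDate column (again, all names are placeholders):

```
-- Copy only the dates the destination is missing.
;WITH MissingDates AS
(
    SELECT LoadDate FROM [SRC].SourceDb.dbo.BigFact
    EXCEPT
    SELECT LoadDate FROM dbo.BigFact
)
INSERT INTO dbo.BigFact
SELECT s.*
FROM [SRC].SourceDb.dbo.BigFact AS s
JOIN MissingDates AS m
    ON m.LoadDate = s.LoadDate;

-- Monitor: per-day row counts on both sides, showing only mismatched days.
SELECT
    COALESCE(s.LoadDate, d.LoadDate) AS LoadDate,
    s.Cnt AS SourceCount,
    d.Cnt AS DestCount
FROM (SELECT LoadDate, COUNT_BIG(*) AS Cnt
      FROM [SRC].SourceDb.dbo.BigFact
      GROUP BY LoadDate) AS s
FULL OUTER JOIN
     (SELECT LoadDate, COUNT_BIG(*) AS Cnt
      FROM dbo.BigFact
      GROUP BY LoadDate) AS d
    ON d.LoadDate = s.LoadDate
WHERE s.Cnt <> d.Cnt
   OR s.Cnt IS NULL
   OR d.Cnt IS NULL;
```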
Is this second solution good enough, or is there a best practice for this situation?