I have an existing DB called MyDb in Azure SQL Server.
I have a bacpac of another DB with several tables in it. I'm interested in importing one single table (it has no FK, which makes things easier) into a dedicated table `MyDb.dbo.ImportedTable`. The final goal is to be able to do some data reconstruction using that table.
Problems are:
- `MyDb.dbo.ImportedTable` is ~60 GB large
- The main column in that table is an `NVARCHAR(MAX)`. That prevents me from using Elastic Queries in Azure: it times out, since Elastic Queries hate anything larger than `NVARCHAR(4000)` (I tried)
I guess a good approach is:
1. Use BCP, but I only have the binary `*.bcp` files (15'000 of them) that are inside the bacpac archive (opened as a zip; they sit in its `Data` folder)
But I'm unable to make it work, mainly because I can't find any documentation on the `*.bcp` file format used inside the bacpac.
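For reference, here is roughly what I tried. Everything below is a sketch: the server name, credentials, chunk file names and the `Data/dbo.ImportedTable/` path are placeholders, and the idea that the bacpac chunk files are bcp *native* format (hence the `-n` switch) is my assumption, not something I found documented. The script only prints the commands (a dry run) so I could sanity-check the loop before pointing it at the real server:

```shell
#!/bin/sh
# All names below are placeholders, not real values from my environment.
SERVER="myserver.database.windows.net"
DB="MyDb"
TABLE="dbo.ImportedTable"

# Build (and here just print) the bcp invocation for one chunk file.
# -n tells bcp the file is in native format, which is what the bacpac
# chunks appear to be; dropping the echo would actually run the import.
import_chunk() {
  printf 'bcp %s in %s -n -S %s -d %s -U myuser -P ***\n' \
      "$TABLE" "$1" "$SERVER" "$DB"
}

# Loop over all chunk files extracted from the bacpac's Data folder.
for f in Data/dbo.ImportedTable/*.bcp; do
  import_chunk "$f"
done
```

With ~15'000 chunk files this loop would open one connection per file, which is part of why I suspect this is not the right approach for 60 GB.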
tl;dr: What is the right approach to import a single ~60 GB table from a bacpac into an existing database in Azure SQL Server?