
I have an existing database called MyDb on Azure SQL Server. I also have a bacpac of another database that contains several tables. I want to import a single table (it has no foreign keys, which makes things easier) into a dedicated table MyDb.dbo.ImportedTable. The final goal is to do some data reconstruction using that table.

Problems are:

  • MyDb.dbo.ImportedTable is about 60 GB
  • The main column in that table is an NVARCHAR(MAX). That rules out Elastic Query in Azure: it times out, since Elastic Query chokes on anything larger than NVARCHAR(4000) (I tried)

I guess a good approach would be to use BCP, but I only have the binary *.bcp files (15,000 of them) that sit inside the bacpac archive (opened as a zip, in its Data folder).

But I'm unable to make it work, especially because I find no documentation about the *.bcp file format used in the bacpac.
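
For reference, a bacpac is a plain zip archive, so the per-table data can at least be listed and pulled out with ordinary zip tools. A minimal bash sketch, assuming a hypothetical archive name MyOther.bacpac and a hypothetical source table dbo.SourceTable (the exact folder names under Data/ depend on the source schema):

# List the data files inside the bacpac (it is a zip archive)
unzip -l MyOther.bacpac "Data/*"
# Extract the *.bcp chunks of a single table into a local folder
unzip MyOther.bacpac "Data/dbo.SourceTable/*" -d extracted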

tl;dr: What is a good approach to import a single ~60 GB table from a bacpac into an existing database on Azure SQL Server?

Askolein
1 Answer


It seems there is no way to simply use the *.bcp files from inside the .bacpac archive. I then had to fall back to bcp itself: first export the data to an intermediate file, which then becomes the source of another bcp command that imports the data into the new DB I want.

To export the data and create the file:

bcp "SELECT * FROM $table WHERE someColumn = someValue" queryout "${table}.csv" -U $user -P $password -S $server -d $dbSource -N -k

And then to import it into the new table:

bcp [$dbTarget].[dbo].[$table] IN "Fresh${table}.csv" -U $user -P $password -S $server -E -N -k
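
For a table of this size it may also be worth committing in batches, so that a failure partway through does not roll back the entire 60 GB load. A sketch with hypothetical values and a hypothetical batch size (-b sets the number of rows per committed batch; by default bcp loads the whole file as a single batch):

bcp [TargetDb].[dbo].[ImportedTable] IN "FreshImportedTable.csv" -U myUser -P myPassword -S target-server.database.windows.net -b 100000 -E -N -k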

Askolein