
I want to import a database from a .bacpac file into a SQL Server Managed Instance.

I have read the SqlPackage Import parameters and properties documentation. It mentions a property, DatabaseMaximumSize=(INT32).
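For context, this is the kind of invocation I mean. This is only a sketch: the server, database, user, and file names are placeholders, and DatabaseMaximumSize is expressed in GB and caps the size of the *target database*, not the source file:

```shell
# Import a .bacpac into a managed instance (all names below are placeholders).
# /p:DatabaseMaximumSize limits the target database size in GB; it says
# nothing about how large a .bacpac file SqlPackage itself can process.
sqlpackage /Action:Import \
    /SourceFile:"/backups/mydb.bacpac" \
    /TargetServerName:"myinstance.database.windows.net" \
    /TargetDatabaseName:"mydb" \
    /TargetUser:"admin" \
    /TargetPassword:"<password>" \
    /p:DatabaseMaximumSize=32
```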

I want to know whether there is a limit to the .bacpac size that SqlPackage can handle.

For example, if I have 8 GB of RAM available, can SqlPackage import a .bacpac file larger than that? In other words, does it avoid loading the whole file into memory?

I'm not asking about runtime performance, but about memory allocation: what happens if the .bacpac is larger than my available RAM?

There is also, for example, DacFx in .NET Core, which has an Import method that receives a .bacpac, but the option to tell it not to load everything into memory doesn't work there.


2 Answers


I don't believe the amount of memory is a factor; the data should be buffered even if your .bacpac file is larger than the memory you have available. (It is a pretty common case that your database is larger than your memory.)

The only hardware limitation to note: if the machine processing the BACPAC is doing so via the Azure Portal or PowerShell, the disk space available may need to be up to triple the size of the BACPAC itself, per the documentation.

There are a few other limitations at the end of that document, none of which are related to hardware or memory.

The data is, after all, stored as BCP files inside a zip archive (a BACPAC is a zip file). It would be very strange if the import routine digested everything in memory before loading it into the database.
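You can see why streaming is natural here: zip members can be opened and read individually, without ever holding the whole archive in memory. A minimal sketch in Python, using a mock archive (the member names are illustrative, not the full BACPAC layout):

```python
import io
import zipfile

# Build a tiny mock zip resembling a bacpac: a model description plus
# per-table BCP data files under Data/<schema>.<table>/.
buf = io.BytesIO()
with zipfile.ZipFile(buf, "w") as z:
    z.writestr("model.xml", "<model/>")
    z.writestr("Data/dbo.Orders/TableData-000.bcp", b"row1row2")

# Each member can be opened as a stream and read in chunks, so a
# consumer never needs the entire archive resident in memory at once.
with zipfile.ZipFile(buf) as z:
    names = z.namelist()
    with z.open("Data/dbo.Orders/TableData-000.bcp") as member:
        chunk = member.read(4)  # read incrementally, 4 bytes here

print(names)   # → ['model.xml', 'Data/dbo.Orders/TableData-000.bcp']
print(chunk)   # → b'row1'
```

An importer built this way only needs working memory proportional to its read buffer, not to the archive size.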


Is this SQL Server on a VM, or Azure SQL Database / Azure SQL Managed Instance? Probably one of the latter two, since for SQL Server on a VM you could just restore from a backup. If you find the import runs too slowly, scale up your machine until the import is completed.
