
I'm attempting to run PyTorch as an Azure Container App Job. I'm being stopped by the size of the shared memory, /dev/shm:

RuntimeError: DataLoader worker (pid 30) is killed by signal: Bus error. It is possible that dataloader's workers are out of shared memory. Please try to raise your shared memory limit.
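For context, a minimal sketch of the kind of setup that hits this (the dataset and sizes here are made up, not my actual code): any DataLoader with num_workers > 0 passes batches from worker processes to the main process through shared memory, so it needs room in /dev/shm.

```python
import torch
from torch.utils.data import DataLoader, TensorDataset

# Dummy dataset standing in for the real one
dataset = TensorDataset(torch.randn(64, 3, 32, 32))

# num_workers > 0 is what exercises /dev/shm: worker processes
# hand tensors back to the parent via shared-memory segments
loader = DataLoader(dataset, batch_size=8, num_workers=2)

for (batch,) in loader:
    pass  # with a tiny /dev/shm, this loop is where the Bus error surfaces
```

Setting num_workers=0 avoids the error but kills loading throughput, so I'd rather fix the shm size.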

On my workstation I can work around this by using the host's shm:

docker run -it --rm --ipc=host <image name>

Is there any way to do that in Azure? I've tried mounting an EmptyDir volume on /dev/shm, to no avail:

volumes:
- name: azure-files-volume
  storageType: AzureFile
  storageName: aifsfileshare
- name: workdir
  storageType: EmptyDir
- name: shm
  storageType: EmptyDir
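And the corresponding mount in the container spec (the container name here is a placeholder for my actual one):

```yaml
containers:
- name: pytorch-job
  image: <image name>
  volumeMounts:
  - volumeName: shm
    mountPath: /dev/shm
```

The job starts, but the DataLoader workers still die with the same Bus error, so the mount doesn't seem to give /dev/shm any more space.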
Lars
