
I'm running a Bitnami WordPress stack on an AWS EC2 instance. Even though df -h shows there is free disk space, I can't create new files or install packages. After checking with df -ih, it looks like the root partition (/) is at 100% inode usage:

bitnami@xxx:/opt/bitnami$ df -ih
Filesystem      Inodes IUsed IFree IUse% Mounted on
udev              486K   288  486K    1% /dev
tmpfs             488K   386  488K    1% /run
/dev/nvme0n1p1    1.9M  1.9M   762  100% /
tmpfs             488K     1  488K    1% /dev/shm
tmpfs             488K     2  488K    1% /run/lock
tmpfs             488K    17  488K    1% /sys/fs/cgroup
/dev/nvme0n1p15      0     0     0     - /boot/efi
tmpfs             488K     4  488K    1% /run/user/1000

I found several commands online (like find / -xdev -type f | wc -l) to locate the directories holding the excessive number of files, but they run extremely slowly or fail for lack of available resources.
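One faster approach I have since come across is letting du count inodes per directory instead of walking the whole tree with find. This is a sketch, assuming GNU coreutils 8.22 or newer (which provides --inodes) on the Debian-based Bitnami image; the /opt/bitnami path in the second command is just an example starting point:

```shell
# Count inodes used under each top-level directory, staying on the
# root filesystem (-x) and only one level deep (-d 1).
sudo du --inodes -x -d 1 / 2>/dev/null | sort -n | tail -20

# Then drill into the biggest offender, e.g. (path assumed, adjust):
sudo du --inodes -x -d 1 /opt/bitnami 2>/dev/null | sort -n | tail -20
```

Repeating this one level at a time converges on the culprit quickly; on WordPress stacks, directories like wp-content/cache (caching plugins) and PHP session directories are commonly reported to accumulate huge numbers of small files.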

What are some efficient ways to track down and remove (or archive) the files responsible for using up all the inodes, especially on a Bitnami WordPress stack? Are there known locations (e.g., logs, cache directories, etc.) on Bitnami WordPress setups that can accumulate large numbers of small files? If cleaning up files isn't enough, is resizing or reformatting the partition with a higher inode count the only long-term fix?

Thanks :)

--------------------------------------------- UPDATE (after @asktyagi's suggestion):

I extended the disk by 20GB and grew the filesystem, and it seems to have worked: inode usage on / dropped to 60% according to df -hi:

Filesystem        Inodes IUsed IFree IUse% Mounted on
udev                486K   288  486K    1% /dev
tmpfs               488K   384  488K    1% /run
/dev/nvme0n1p1      3.2M  1.9M  1.3M   60% /
tmpfs               488K     1  488K    1% /dev/shm
tmpfs               488K     2  488K    1% /run/lock
tmpfs               488K    17  488K    1% /sys/fs/cgroup
/dev/nvme0n1p15        0     0     0    - /boot/efi
tmpfs               488K     4  488K    1% /run/user/1000

and df -h:

Filesystem       Size  Used Avail Use% Mounted on
udev             1.9G     0  1.9G   0% /dev
tmpfs            391M  5.3M  385M   2% /run
/dev/nvme0n1p1    49G   17G   31G  36% /
tmpfs            2.0G     0  2.0G   0% /dev/shm
tmpfs            5.0M     0  5.0M   0% /run/lock
tmpfs            2.0G     0  2.0G   0% /sys/fs/cgroup
/dev/nvme0n1p15  124M  278K  124M   1% /boot/efi
tmpfs            391M     0  391M   0% /run/user/1000
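For anyone following along, this is a sketch of the standard EBS resize procedure I used, assuming the device names shown in the df output above and a cloud image that ships growpart (cloud-guest-utils):

```shell
# After enlarging the EBS volume in the AWS console or CLI:
sudo growpart /dev/nvme0n1 1     # grow partition 1 to fill the larger disk
sudo resize2fs /dev/nvme0n1p1    # grow the ext4 filesystem online
```

Growing an ext4 filesystem adds new block groups, each carrying inodes at the filesystem's original inodes-per-group ratio, which is why the inode total rose from 1.9M to 3.2M rather than staying fixed.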
