Docker Slow Ext4 Partition

Your host machine should have an ext4 partition for Docker to run on and create containers and images from. I recently had a case where the / partition was running very low on disk space, but an additional disk mounted at /home/ had plenty of room; since Docker stores everything under /var/lib/docker by default, my / partition was nearly full. To fix that, I moved the default /var/lib/docker to another directory on the /home partition.


TL;DR: you can't run Docker's storage on an NTFS partition, at least for now.

Long Version:

So, I've been learning Docker recently, and I noticed that it takes up quite a lot of space on my drive just for random images I pull from Docker Hub (yes, purely out of curiosity). I'm also being considerate of my tiny weeny SSD's write cycles. After my previous post here about almost the same problem, I'm now trying to do the same thing on Linux.

After struggling with this problem for two days and much Stack Overflow later, the best (or most recommended) solution was:

  1. Make a new file called daemon.json in /etc/docker
  2. And just fill it with this
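
I don't have the original snippet here, so take this as a minimal sketch of what that daemon.json looks like. The data-root key tells the daemon where to keep its images, containers, and volumes; the path /media/data/docker is just an example location on the big drive, so adjust it to yours (very old Docker releases used the graph key instead of data-root):

```
# Create the config directory if it doesn't exist yet
sudo mkdir -p /etc/docker

# Point Docker's data directory somewhere else
# (the path is an example; use your own)
cat <<'EOF' | sudo tee /etc/docker/daemon.json
{
  "data-root": "/media/data/docker"
}
EOF
```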

Great, we’re done.

But wait

What? Okay, calm down. What could the problem be here? It mentions something about overlay stuff; what is that?

So, apparently, Docker uses a storage driver to write to a container's writable layer. I'm not a Docker expert yet, but in this case Docker was probably trying to write the image layers into our data directory, and because that location was on an NTFS filesystem, it ran into problems.

The default storage driver being used is overlay2, which is what the newest Docker versions support. The storage driver that might still work on NTFS is vfs, but it can be rather slow, as mentioned in this issue.

Apparently, the storage drivers also depend on something called the backing filesystem, and some backing filesystems are known to be incompatible with Docker; NTFS is one of them (check here). Honestly, I didn't really need it to be that fast, but I want it, because I can ;)
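
If you want to check what your own setup uses, docker info reports both the storage driver and the backing filesystem, and df shows what sits under the data directory (the path below assumes the default /var/lib/docker):

```
# Storage driver and backing filesystem as reported by Docker itself
docker info 2>/dev/null | grep -iE 'storage driver|backing filesystem'

# Filesystem type behind Docker's default data directory
df -T /var/lib/docker
```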

My solution? Since I have a 1 TB hard drive, I actually planned to convert the whole drive to ext4 to satisfy my inner perfectionist, because I don't like having many partitions on one drive. But since the drive was already 60% full, that would take ages (not really, but long enough). The fast solution: shrink the main NTFS partition and create a new ext4 partition in the freed space.
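
I won't cover the shrinking itself (GParted or the Disks application handles it); if you format the new partition from a terminal instead, it is roughly the following. The device name /dev/sda3 is only a placeholder, so double-check yours with lsblk before formatting anything:

```
# List disks and partitions first so you format the right one
lsblk

# Format the new partition as ext4 and label it for easy identification
# WARNING: this erases the partition; /dev/sda3 is a placeholder
sudo mkfs.ext4 -L docker-data /dev/sda3
```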

Done. Now create a mount point for this partition. You can totally do that by editing /etc/fstab, but because I want to make my life easier, I just did it in the Disks application.
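
For reference, the hand-edited /etc/fstab route looks roughly like this, assuming the partition is labelled docker-data and you want it mounted at /mnt/docker-data (both names are just examples):

```
# Create the mount point
sudo mkdir -p /mnt/docker-data

# Mount it on every boot, then mount it now
echo 'LABEL=docker-data  /mnt/docker-data  ext4  defaults  0  2' | sudo tee -a /etc/fstab
sudo mount /mnt/docker-data
```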

And now change daemon.json, but first stop the Docker daemon.
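
On a systemd-based distro that is just:

```
# Stop the daemon before touching its configuration
# (older init systems: sudo service docker stop)
sudo systemctl stop docker
```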

Edit /etc/docker/daemon.json.
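
Same sketch as before, with /mnt/docker-data standing in for wherever you mounted the new ext4 partition:

```
# Point data-root at the ext4 mount
cat <<'EOF' | sudo tee /etc/docker/daemon.json
{
  "data-root": "/mnt/docker-data"
}
EOF
```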

Restart the Docker daemon.
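
Assuming systemd again; the "Docker Root Dir" line from docker info confirms the daemon picked up the new location:

```
sudo systemctl start docker

# Should now print the new data-root
docker info 2>/dev/null | grep 'Docker Root Dir'
```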

Yeahh, we're done, boys.