
Re: optimising filesystem for many small files

Viji V Nair wrote:

System: Fedora 11 x86_64
Current filesystem: 150 GB ext4 (formatted with the "-T small" option)
Number of files: 50 million PNG images, 1 KB to 30 KB each

We are generating these files with a Python programme and getting very slow IO performance. During generation there are only writes, no reads; after generation there are heavy reads and no writes.

I am looking for best practices/recommendations for getting better performance.

Any suggestions on the above are greatly appreciated.


I would start by using blktrace and/or seekwatcher to see what your IO patterns look like while you're populating the disk; my guess is that you're seeing IO scattered all over the device.
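For example (run as root, with the blktrace and seekwatcher packages installed; /dev/sdb and the "mytrace" name below are just placeholders, and the exact options may differ on your versions):

    # capture a trace of the device while the generator is running,
    # then stop it with ctrl-C
    blktrace -d /dev/sdb -o mytrace

    # dump the raw event stream
    blkparse -i mytrace | less

    # or plot the seek pattern over time as a graph
    seekwatcher -t mytrace -o mytrace.png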

How you place the files in subdirectories will affect this quite a lot. Sitting in one directory for a while, filling it with images before moving on to the next, will probably help; putting each new file in a new subdirectory will probably give very bad results. See the sketch below.
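In rough outline, something like this (a sketch only, not your actual generator: FILES_PER_DIR, the naming scheme, and generate_png() are invented for illustration):

    import os

    FILES_PER_DIR = 10000           # invented value; tune to taste
    TOTAL_FILES = 50000000

    def generate_png(n):
        # stand-in for whatever actually renders image number n
        return b"\x89PNG..."

    # fill each subdirectory completely before starting the next, so
    # consecutively created inodes and data blocks land near each other
    for d in range(TOTAL_FILES // FILES_PER_DIR):
        subdir = os.path.join("images", "%05d" % d)
        if not os.path.isdir(subdir):
            os.makedirs(subdir)
        for i in range(FILES_PER_DIR):
            n = d * FILES_PER_DIR + i
            with open(os.path.join(subdir, "%08d.png" % n), "wb") as f:
                f.write(generate_png(n))

The exact numbers matter less than the access pattern: consecutive creates should land in the same directory.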

