Python os.walk memory issue


December 2018


Viewed 611 times


I wrote a scanner that looks for certain files on all hard drives of the system being scanned. Some of these systems are pretty old, running Windows 2000 with 256 or 512 MB of RAM, but their file system structure is complex because some of them serve as file servers.

I use os.walk() in my script to parse all directories and files.

Unfortunately we noticed that the scanner consumes a lot of RAM after some time of scanning, and we figured out that the os.walk function alone uses about 50 MB of RAM after 2 hours of walking the file system. This RAM usage keeps increasing over time; we were at about 90 MB after 4 hours of scanning.

Is there a way to avoid this behaviour? We also tried "betterwalk.walk()" and "scandir.walk()", with the same result. Do we have to write our own walk function that removes already-scanned directory and file objects from memory so that the garbage collector can reclaim them from time to time?

[screenshot: resource usage over time - second row is memory]


3 answers


Have you tried the glob module?

import os, glob

def globit(srchDir):
    srchDir = os.path.join(srchDir, "*")
    for name in glob.glob(srchDir):
        print(name)
        if os.path.isdir(name):
            globit(name)  # recurse into subdirectories

if __name__ == '__main__':
    dir = r'C:\working'
    globit(dir)
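
If the list that glob.glob() builds for very large directories is itself a concern, glob.iglob() yields the matches lazily instead of materialising the whole list. A minimal sketch of that variant (the globit_lazy name is just a placeholder, not from the answer):

import os, glob

def globit_lazy(srchDir):
    # glob.iglob returns an iterator, so only one path is held at a time
    for name in glob.iglob(os.path.join(srchDir, "*")):
        print(name)
        if os.path.isdir(name):
            globit_lazy(name)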

If you are working inside an os.walk loop, del everything you no longer need, and try running gc.collect() at the end of each os.walk iteration.
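
A minimal sketch of that suggestion, assuming the scanner only needs to collect matching paths (the scan() helper and the '*.txt' pattern are placeholders, not from the question):

import os
import gc
import fnmatch

def scan(top, pattern='*.txt'):  # hypothetical helper and pattern
    hits = []
    for root, dirs, files in os.walk(top):
        for name in fnmatch.filter(files, pattern):
            hits.append(os.path.join(root, name))
        # drop the local reference to the file list once it is processed,
        # then give the garbage collector a chance to run, as suggested above
        del files
        gc.collect()
    return hits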


Generators are a better solution, as they evaluate lazily: only one path needs to be in memory at a time instead of an accumulated list. Here is an example implementation.

import os
import fnmatch

#this may or may not be implemented
def list_dir(path):
    for name in os.listdir(path):
        yield os.path.join(path, name)

#modify this to take some pattern as input 
def os_walker(top):
    for root, dlist, flist in os.walk(top):
        for name in fnmatch.filter(flist, '*.py'):
            yield os.path.join(root, name)

all_dirs = list_dir("D:\\tuts\\pycharm")

for l in all_dirs:
    for name in os_walker(l):
        print(name)

Thanks to David Beazley.