Python multiprocessing - How to release memory when a process is done?

Problem Description

I encountered a weird problem while using the Python multiprocessing library.

My code is sketched below: I spawn a process for each "symbol, date" tuple. I combine the results afterwards.

I expected that when a process has finished computing for a "symbol, date" tuple, it would release its memory. Apparently that's not the case. I see dozens of processes (though I set the process pool to have size 7) that are suspended¹ on the machine. They consume no CPU, and they don't release their memory.

How do I let a process release its memory, after it has done its computation?

Thanks!

¹ By "suspended" I mean their status in the ps command is shown as "S+".

from multiprocessing import Pool

def do_one_symbol(symbol, all_date_strings):
    # One pool of 7 workers; submit one task per date for this symbol.
    # 'work' is the per-(symbol, date) computation defined elsewhere.
    pool = Pool(processes=7)
    results = []
    for date in all_date_strings:
        res = pool.apply_async(work, [symbol, date])
        results.append(res)

    # Combine the results of all tasks.
    gg = mm = ss = 0
    for res in results:
        g, m, s = res.get()
        gg += g
        mm += m
        ss += s
    return gg, mm, ss

Recommended Answer

Have you tried closing the pool with pool.close and then waiting for the worker processes to finish with pool.join? If the parent process keeps running and does not wait for its child processes, they become zombies.
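A minimal sketch of that suggestion, based on the code from the question: the work function below is just a stand-in for the real one, the pool size of 7 is kept from the question, and the symbol and dates in the usage example are made up.

from multiprocessing import Pool

def work(symbol, date):
    # Stand-in for the real per-(symbol, date) computation.
    return 1, 2, 3

def do_one_symbol(symbol, all_date_strings):
    pool = Pool(processes=7)
    results = [pool.apply_async(work, [symbol, date]) for date in all_date_strings]

    # No more tasks will be submitted: close the pool, then wait for the
    # worker processes to exit so the parent reaps them instead of
    # leaving them hanging around.
    pool.close()
    pool.join()

    # The values produced by apply_async are still retrievable after join().
    gg = mm = ss = 0
    for res in results:
        g, m, s = res.get()
        gg += g
        mm += m
        ss += s
    return gg, mm, ss

if __name__ == "__main__":
    print(do_one_symbol("AAPL", ["2021-01-04", "2021-01-05"]))

Note that pool.join() must be preceded by pool.close() (or pool.terminate()). Once the workers have exited, their memory is returned to the operating system and they no longer show up in ps.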
