Python Multiprocessing appending list
Question
I have a quick question about a shared variable between multiple processes using Multiprocessing.Pool().
Will I run into any issues if I am updating a global list from within multiple processes, i.e. if two of the processes try to update the list at the same time?
I have seen documentation about using a Lock for similar things, but I was wondering whether it is necessary.
The way I am sharing this variable is by using a global variable in my callback function, 'successes', to which I append all of the successful actions after the target function has completed:
import multiprocessing as mp

TOTAL_SUCCESSES = []

def func(inputs):
    successes = []
    for input in inputs:
        result = ...  # something with return code
        if result == 0:
            successes.append(input)
    return successes

def callback(successes):
    global TOTAL_SUCCESSES
    for entry in successes:
        TOTAL_SUCCESSES.append(entry)

def main():
    pool = mp.Pool()
    for entry in myInputs:
        pool.apply_async(func, args=(entry,), callback=callback)
Apologies for any syntax errors; I wrote this up quickly, but the program is working. I am just wondering whether I will have issues if I add the shared variable.
Thanks in advance!
Answer
With your current code, you're not actually sharing TOTAL_SUCCESSES between processes. callback is executed in the main process, in a result-handling thread. There is only one result-handling thread, so each callback will be run one at a time, not concurrently. So your code as written is process/thread safe.
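If you want to see this for yourself, here is a minimal sketch (not from the original answer; the function names and inputs are made up for illustration) that prints the thread running each callback. Every callback reports the same result-handler thread in the main process, confirming the callbacks never run concurrently with each other:

import multiprocessing as mp
import threading

def work(x):
    # Placeholder worker: just square the input.
    return x * x

def on_result(value):
    # All results are delivered by the pool's single result-handling thread,
    # so the printed thread name is the same for every callback.
    print(threading.current_thread().name, value)

if __name__ == "__main__":
    pool = mp.Pool()
    for i in range(5):
        pool.apply_async(work, args=(i,), callback=on_result)
    pool.close()
    pool.join()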
However, make sure you return successes from func (as in the code above); without the return, the callback would receive None.
Also, this could be written much more succinctly using map:
import multiprocessing as mp

def func(inputs):
    successes = []
    for input in inputs:
        result = ...  # something with return code
        if result == 0:
            successes.append(input)
    return successes

def main():
    pool = mp.Pool()
    total_successes = pool.map(func, myInputs)  # returns a list of lists
    # Flatten the list of lists
    total_successes = [ent for sublist in total_successes for ent in sublist]
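For completeness, here is a self-contained usage sketch of the map-based approach (not from the original answer). The even-number check inside func is a made-up stand-in for the elided return-code logic, and myInputs is example data; itertools.chain.from_iterable is an equivalent way to flatten the list of lists returned by pool.map:

import multiprocessing as mp
from itertools import chain

def func(inputs):
    successes = []
    for item in inputs:
        result = 0 if item % 2 == 0 else 1  # stand-in for the real return code
        if result == 0:
            successes.append(item)
    return successes

if __name__ == "__main__":
    myInputs = [[1, 2, 3], [4, 5], [6, 7, 8, 9]]  # example chunks, one per task
    with mp.Pool() as pool:
        total_successes = list(chain.from_iterable(pool.map(func, myInputs)))
    print(total_successes)  # [2, 4, 6, 8]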