Running multiprocessing inside a decorator
Problem description
I want to follow up on my earlier question about multiprocessing inside a decorator (my previous question seems to have died :)). I have stumbled over this problem and unfortunately I don't know how to solve it. For my application I need to use multiprocessing inside a decorator, but when I do, I get the error:

Can't pickle <function run_testcase at 0x00000000027789C8>: it's not found as __main__.run_testcase

On the other hand, when I call my multiprocessing function like a normal function, wrapper(function,*arg), it works. This is very puzzling and I don't know what I am doing wrong. I am almost ready to conclude that it is a Python bug :). Maybe someone knows a workaround that keeps the same syntax. I am running this code on Windows (unfortunately).
Previous question: Using multiprocessing inside decorator generates error: can't pickle function...it's not found as
The simplest code that reproduces the error:
# z_helper.py
from multiprocessing import Process,Event

class ExtProcess(Process):
    def __init__(self, event,*args,**kwargs):
        self.event=event
        Process.__init__(self,*args,**kwargs)

    def run(self):
        Process.run(self)
        self.event.set()

class PythonHelper(object):
    @staticmethod
    def run_in_parallel(*functions):
        event=Event()
        processes=dict()
        for function in functions:
            fname=function[0]
            try:fargs=function[1]
            except:fargs=list()
            try:fproc=function[2]
            except:fproc=1
            for i in range(fproc):
                process=ExtProcess(event,target=fname,args=fargs)
                process.start()
                processes[process.pid]=process
        event.wait()
        for process in processes.values():
            process.terminate()
        for process in processes.values():
            process.join()
# z_recorder.py
class Recorder(object):
    def capture(self):
        while True:print("recording")
# z_wrapper.py
from z_helper import PythonHelper
from z_recorder import Recorder

def wrapper(fname,*args):
    try:
        PythonHelper.run_in_parallel([fname,args],[Recorder().capture])
        print("success")
    except Exception as e:
        print("failure: {}".format(e))
# z_report.py
from z_wrapper import wrapper
from functools import wraps

class Report(object):
    @staticmethod
    def debug(fname):
        @wraps(fname)
        def function(*args):
            wrapper(fname,args)
        return function
And the script that is executed:
# main.py
from z_report import Report
import time

class Test(object):
    @Report.debug
    def print_x(self,x):
        for index,data in enumerate(range(x)):
            print(index,data); time.sleep(1)

if __name__=="__main__":
    Test().print_x(10)
I added @wraps compared to the previous version.
My traceback:
Traceback (most recent call last):
  File "C:\Interpreters\Python32\lib\pickle.py", line 679, in save_global
    klass = getattr(mod, name)
AttributeError: 'module' object has no attribute 'run_testcase'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "C:\EskyTests\w_Logger.py", line 19, in <module>
    logger.run_logger()
  File "C:\EskyTests\w_Logger.py", line 14, in run_logger
    self.run_testcase()
  File "C:\EskyTests\w_Decorators.py", line 14, in wrapper
    PythonHelper.run_in_parallel([function,args],[recorder.capture])
  File "C:\EskyTests\w_PythonHelper.py", line 25, in run_in_parallel
    process.start()
  File "C:\Interpreters\Python32\lib\multiprocessing\process.py", line 130, in start
    self._popen = Popen(self)
  File "C:\Interpreters\Python32\lib\multiprocessing\forking.py", line 267, in __init__
    dump(process_obj, to_child, HIGHEST_PROTOCOL)
  File "C:\Interpreters\Python32\lib\multiprocessing\forking.py", line 190, in dump
    ForkingPickler(file, protocol).dump(obj)
  File "C:\Interpreters\Python32\lib\pickle.py", line 237, in dump
    self.save(obj)
  File "C:\Interpreters\Python32\lib\pickle.py", line 344, in save
    self.save_reduce(obj=obj, *rv)
  File "C:\Interpreters\Python32\lib\pickle.py", line 432, in save_reduce
    save(state)
  File "C:\Interpreters\Python32\lib\pickle.py", line 299, in save
    f(self, obj) # Call unbound method with explicit self
  File "C:\Interpreters\Python32\lib\pickle.py", line 623, in save_dict
    self._batch_setitems(obj.items())
  File "C:\Interpreters\Python32\lib\pickle.py", line 656, in _batch_setitems
    save(v)
  File "C:\Interpreters\Python32\lib\pickle.py", line 299, in save
    f(self, obj) # Call unbound method with explicit self
  File "C:\Interpreters\Python32\lib\pickle.py", line 683, in save_global
    (obj, module, name))
_pickle.PicklingError: Can't pickle <function run_testcase at 0x00000000027725C8>: it's not found as __main__.run_testcase
Answer
The multiprocessing module "invokes" a function in the slave process by pickling it on the calling side: it has to send the function's name to the slave process over the IPC interface it creates, so the pickler figures out the right name to use and sends it, and on the other end the unpickler turns that name back into a function.

If the function is a class member, it cannot be pickled correctly without help. @staticmethod members are even worse off, because their type is function rather than instancemethod, which fools the pickler. Witness, without even using multiprocessing:
import pickle

class Klass(object):
    @staticmethod
    def func():
        print 'func()'

    def __init__(self):
        print 'Klass()'

obj = Klass()
obj.func()
print pickle.dumps(obj.func)
Produces:
Klass()
func()
Traceback (most recent call last):
...
pickle.PicklingError: Can't pickle <function func at 0x8017e17d0>: it's not found as __main__.func
The problem is even more obvious when you try to pickle a regular non-static method such as obj.__init__, because then the pickler realizes it really is an instance method:
TypeError: can't pickle instancemethod objects
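For reference, here is a tiny sketch (my addition, not from the original answer), assuming Python 2.7, that is enough to trigger that message:

import pickle

class Klass(object):
    def method(self):
        pass

# On Python 2.7, bound methods cannot be pickled at all:
pickle.dumps(Klass().method)   # TypeError: can't pickle instancemethod objects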
However, all is not lost. You just need to add a level of indirection: provide an ordinary function that creates the instance binding in the target process, passing it at least two arguments, the (pickle-able) class instance and the name of the function. For completeness it also takes any arguments to use when calling the function. This ordinary function is then invoked in the target process, and it in turn calls the class's member function:
def call_name(instance, name, *args, **kwargs):
    "helper function for multiprocessing: call getattr(instance, name)(*args, **kwargs)"
    return getattr(instance, name)(*args, **kwargs)
Now, instead of this (copied from your linked post):
PythonHelper.run_in_parallel([self.run_testcase],[recorder.capture])
you would do this (you may want to fiddle with the calling sequence):
PythonHelper.run_in_parallel([call_name, (self, 'run_testcase')],
[recorder.capture])
(Note: this is all untested and probably has bugs of various kinds.)
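To see the indirection working on its own, here is a minimal, self-contained sketch (my addition, not part of the original answer; the Worker class and the sketch.py file name are made up for illustration). The child process only has to pickle the module-level call_name function, a plain picklable instance, and a method name, so it also works with Windows' spawn-style process creation:

# sketch.py -- hypothetical standalone example, not from the original post
from multiprocessing import Process

def call_name(instance, name, *args, **kwargs):
    "helper function for multiprocessing: call getattr(instance, name)(*args, **kwargs)"
    return getattr(instance, name)(*args, **kwargs)

class Worker(object):                # stands in for the Test class above
    def print_x(self, x):
        for index in range(x):
            print(index)

if __name__ == "__main__":
    w = Worker()
    # Instead of target=w.print_x (a bound method, which the pickler rejects
    # on the Python versions discussed here), send the picklable pieces and
    # rebuild the call inside the child process:
    p = Process(target=call_name, args=(w, "print_x", 3))
    p.start()
    p.join()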
Update
I took the new code you posted and tried it out.
First, I had to fix the indentation in z_report.py (dedent the whole class Report).
With that done, running it produced a rather different error from the one you showed:
Process ExtProcess-1:
Traceback (most recent call last):
  File "/usr/local/lib/python2.7/multiprocessing/process.py", line 258, in _bootstrap
    self.run()
  File "/tmp/t/marcin/z_helper.py", line 9, in run
    Process.run(self)
  File "/usr/local/lib/python2.7/multiprocessing/process.py", line 114, in run
recording
[infinite spew of "recording" messages]
Fixing the endless "recording" messages:
diff --git a/z_recorder.py b/z_recorder.py
index 6163a87..a482268 100644
--- a/z_recorder.py
+++ b/z_recorder.py
@@ -1,4 +1,6 @@
+import time
 class Recorder(object):
     def capture(self):
-        while True:print("recording")
-
+        while True:
+            print("recording")
+            time.sleep(5)
That leaves one problem: the wrong arguments to print_x:
TypeError: print_x() takes exactly 2 arguments (1 given)
At this point, Python is actually doing everything right for you; it's just that z_wrapper.wrapper overdoes it a bit:
diff --git a/z_wrapper.py b/z_wrapper.py
index a0c32bf..abb1299 100644
--- a/z_wrapper.py
+++ b/z_wrapper.py
@@ -1,7 +1,7 @@
 from z_helper import PythonHelper
 from z_recorder import Recorder
-def wrapper(fname,*args):
+def wrapper(fname,args):
     try:
         PythonHelper.run_in_parallel([fname,args],[Recorder().capture])
         print("success")
The problem here is that, by the time you reach z_wrapper.wrapper, the function's arguments have already been bundled into a single tuple. z_report.Report.debug has

def function(*args):

so that the two arguments (in this case, the instance of main.Test and the value 10) are packed into one tuple. You just want z_wrapper.wrapper to pass that (single) tuple on to PythonHelper.run_in_parallel to supply the arguments. If you add another *args, that tuple gets wrapped into another (this time one-element) tuple. (You can see this by adding print "args:", args to z_wrapper.wrapper.)
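A small sketch (my addition, not from the original answer) shows the double wrapping in isolation:

def function(*args):              # like z_report.Report.debug's inner function
    return args                   # already a tuple, e.g. (test_instance, 10)

def wrapper_star(fname, *args):   # the old z_wrapper.wrapper signature
    return args                   # wraps the tuple again

def wrapper_plain(fname, args):   # the fixed signature
    return args                   # passes the tuple through unchanged

packed = function("test_instance", 10)
print(wrapper_star("print_x", packed))    # (('test_instance', 10),)
print(wrapper_plain("print_x", packed))   # ('test_instance', 10)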