Combining asyncio with a multi-worker ProcessPoolExecutor



Question

Is it possible to take a blocking function such as work and have it run concurrently in a ProcessPoolExecutor that has more than one worker process?

import asyncio
from time import sleep, time
from concurrent.futures import ProcessPoolExecutor

num_jobs = 4
queue = asyncio.Queue()
executor = ProcessPoolExecutor(max_workers=num_jobs)
loop = asyncio.get_event_loop()

def work():
    sleep(1)

async def producer():
    for i in range(num_jobs):
        results = await loop.run_in_executor(executor, work)
        await queue.put(results)

async def consumer():
    completed = 0
    while completed < num_jobs:
        job = await queue.get()
        completed += 1

s = time()
loop.run_until_complete(asyncio.gather(producer(), consumer()))
print("duration", time() - s)

Running the above on a machine with 4 or more cores takes roughly 4 seconds. How would you write producer so that the example above takes only ~1 second?

Answer

await loop.run_in_executor(executor, work) suspends the producer coroutine until work has finished, so only one job is running at a time.

To run the jobs concurrently, you can use asyncio.as_completed:

async def producer():
    tasks = [loop.run_in_executor(executor, work) for _ in range(num_jobs)]
    for f in asyncio.as_completed(tasks):  # the loop= argument is unnecessary and was removed in Python 3.10
        results = await f
        await queue.put(results)
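
For reference, here is a minimal self-contained sketch of the whole example with this fix applied. It is an adaptation rather than part of the original answer: it assumes Python 3.7+ and uses asyncio.run plus an if __name__ == "__main__" guard so the pool's worker processes can be spawned safely; the structure otherwise mirrors the question's code.

import asyncio
from time import sleep, time
from concurrent.futures import ProcessPoolExecutor

num_jobs = 4

def work():
    sleep(1)  # stand-in for a blocking / CPU-bound task

async def main():
    loop = asyncio.get_running_loop()
    queue = asyncio.Queue()

    async def producer(executor):
        # submit every job up front so the pool can run them in parallel
        tasks = [loop.run_in_executor(executor, work) for _ in range(num_jobs)]
        for f in asyncio.as_completed(tasks):
            results = await f
            await queue.put(results)

    async def consumer():
        completed = 0
        while completed < num_jobs:
            await queue.get()
            completed += 1

    with ProcessPoolExecutor(max_workers=num_jobs) as executor:
        s = time()
        await asyncio.gather(producer(executor), consumer())
        print("duration", time() - s)

if __name__ == "__main__":
    asyncio.run(main())

Because all the futures are submitted before any of them is awaited, the four work calls execute in separate processes at the same time, and the whole run should take roughly 1 second instead of 4.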
