Make multiprocessing.Queue accessible from asyncio [duplicate] - python-asyncio

This question already has answers here:
FastAPI runs api-calls in serial instead of parallel fashion
(2 answers)
Is there a way to use asyncio.Queue in multiple threads?
(4 answers)
Closed 19 days ago.
Given a multiprocessing.Queue that is filled from different Python threads created via ThreadPoolExecutor.submit(...):
How can that queue be accessed safely and reliably from asyncio / Trio / AnyIO (in this case inside FastAPI)?
I am aware of Janus library, but prefer a custom solution here.
Asked (hopefully) more concisely:
How to implement
await <something_is_in_my_multiprocessing_queue>
to make it accessible with async/await and to prevent blocking the event loop?
What synchronization mechanism would you suggest in general?
(Attention here: multiprocessing.Queue, not asyncio.Queue)

Actually, I figured it out.
Given a method that reads the mp.Queue:

def read_queue_blocking():
    return queue.get()

And this is the main issue: a call to get() is blocking.
We can now either
use asyncio.loop.run_in_executor in the asyncio event loop (see https://docs.python.org/3/library/asyncio-eventloop.html#asyncio.loop.run_in_executor), or
use AnyIO with await anyio.to_thread.run_sync(...)
to execute the blocking retrieval of data from the queue in a separate thread.
For FastAPI:

import anyio

@app.websocket("/ws/{client_id}")
async def websocket_endpoint(websocket: WebSocket, client_id: str):
    await websocket.accept()
    while True:
        queue_result = await anyio.to_thread.run_sync(read_queue_blocking)
        await websocket.send_text(f"Message text was: {queue_result}")
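For completeness, the run_in_executor route looks like this. A minimal self-contained sketch (plain asyncio, no FastAPI, with a hypothetical module-level queue) that drains a multiprocessing.Queue without blocking the event loop:

```python
import asyncio
import multiprocessing

queue = multiprocessing.Queue()

def read_queue_blocking():
    # Blocking get(); must run in a worker thread, never on the event loop.
    return queue.get()

async def consume(n):
    loop = asyncio.get_running_loop()
    results = []
    for _ in range(n):
        # Off-load the blocking call to the default ThreadPoolExecutor.
        item = await loop.run_in_executor(None, read_queue_blocking)
        results.append(item)
    return results

if __name__ == "__main__":
    for i in range(3):
        queue.put(f"msg-{i}")
    print(asyncio.run(consume(3)))  # ['msg-0', 'msg-1', 'msg-2']
```

Passing None as the executor uses the loop's default thread pool; a dedicated ThreadPoolExecutor works the same way.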

I reworked the answer to show the case where the main thread with the asyncio loop is fed with data from child processes (ProcessPoolExecutor):
from concurrent.futures import ProcessPoolExecutor
import asyncio
from random import randint
from functools import partial

def some_heavy_task() -> int:
    sum(i * i for i in range(10 ** 8))  # burn some CPU in the child process
    return randint(1, 9)

def callback(fut: asyncio.Future, q: asyncio.Queue) -> None:
    """Callback is used instead of an mp.Queue to get the feed from child processes."""
    loop = asyncio.get_event_loop()
    # check cancelled() first: exception() raises on a cancelled future
    if not fut.cancelled() and not fut.exception():
        loop.call_soon(q.put_nowait, f"name-{fut.name}: {fut.result()}")

async def result_picker(q: asyncio.Queue) -> None:
    """Returns results to some outer world."""
    while True:
        res = await q.get()
        # imagine it is a websocket
        print(f"Result from heavy_work_producer: {res}")
        q.task_done()  # mark task as done here

async def heavy_work_producer(q: asyncio.Queue) -> None:
    """Wrapper around all multiprocessing work."""
    loop = asyncio.get_event_loop()
    with ProcessPoolExecutor(max_workers=4) as pool:
        heavy_tasks = [loop.run_in_executor(pool, some_heavy_task) for _ in range(12)]
        for i, t in enumerate(heavy_tasks):
            t.name = i  # name them before attaching the callback that reads the name
            t.add_done_callback(partial(callback, q=q))
        await asyncio.gather(*heavy_tasks)

async def amain():
    """Main entrypoint of the async app."""
    q = asyncio.Queue()
    asyncio.create_task(result_picker(q))
    await heavy_work_producer(q)
    # do not abandon result_picker when heavy_work_producer is done:
    # wait for all results to show
    await q.join()
    print("All done.")

if __name__ == '__main__':
    asyncio.run(amain())
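To connect this back to the original question (an actual multiprocessing queue rather than a callback), here is a hedged sketch of the same producer/consumer shape. It uses a Manager().Queue() so the queue can be pickled into the pool workers, and run_in_executor to await the blocking get(); worker and consume are illustrative names, not part of any library:

```python
import asyncio
import multiprocessing
from concurrent.futures import ProcessPoolExecutor

def worker(q, i):
    # Child process pushes its result straight into the shared queue.
    q.put(i * i)

async def consume(q, n):
    loop = asyncio.get_running_loop()
    # q.get blocks, so await it via a thread from the default executor.
    return [await loop.run_in_executor(None, q.get) for _ in range(n)]

async def amain():
    manager = multiprocessing.Manager()
    q = manager.Queue()  # picklable proxy, safe to pass to a ProcessPoolExecutor
    loop = asyncio.get_running_loop()
    with ProcessPoolExecutor(max_workers=2) as pool:
        jobs = [loop.run_in_executor(pool, worker, q, i) for i in range(4)]
        results = await consume(q, 4)
        await asyncio.gather(*jobs)
    print(sorted(results))  # [0, 1, 4, 9]

if __name__ == "__main__":
    asyncio.run(amain())
```

A plain multiprocessing.Queue cannot be passed as an argument to pool workers (it must be inherited), which is why the manager proxy is used here.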

Related

Shiny for Python: Implementing an asynchronous iterator (almost there)

The endgame is making an app reactive to a non-blocking stream of information (in my particular case a MongoDB ChangeStream; it could also be a Kafka consumer).
For the sake of reproducibility, in the example below I implement a generic asynchronous iterator AsyncIteratorDummy that mimics the behaviour of a data stream:
import asyncio
from shiny import reactive, ui, Inputs, Outputs, Session, App, render

class AsyncIteratorDummy:
    '''Iterate over an asynchronous source for n iterations.'''
    def __init__(self, n):
        self.current = 0
        self.n = n

    def __aiter__(self):
        return self

    async def __anext__(self):
        await asyncio.sleep(1)
        print(f"get next element {self.current}")
        self.current += 1
        if self.current > self.n:
            raise StopAsyncIteration
        return self.current - 1

async def watch_changes(rval: reactive.Value):
    async for i in AsyncIteratorDummy(5):
        print(f"next element {i}")
        rval.set(i)

app_ui = ui.page_fluid(
    "This should update automatically",
    ui.output_text_verbatim("async_text"),
)

def server(input: Inputs, output: Outputs, session: Session):
    triggered_val = reactive.Value(-1)
    asyncio.create_task(watch_changes(triggered_val))

    @output(id="async_text")
    @render.text()
    async def _():
        return triggered_val.get()

    # un/commenting this makes the invalidation
    # of `triggered_val` effective or not:
    @reactive.Effect
    def _():
        reactive.invalidate_later(0.1)

app = App(app_ui, server)
The app works because of the presence of

@reactive.Effect
def _():
    reactive.invalidate_later(0.1)
Otherwise, async_text greys out (indicating it has been invalidated) but does not update.
Is it possible to implement the asynchronous iteration without the "hack" of the reactive.Effect invalidating on loop?
My supposition is that I have to "flush" or "execute" invalidated variables in the context of watch_changes() (after rval.set(i)), using a low-level py-shiny function that I cannot figure out.
I think you are looking for reactive.flush().

async def watch_changes(rval: reactive.Value):
    async for i in AsyncIteratorDummy(5):
        print(f"next element {i}")
        rval.set(i)
        reactive.flush()

Simple syntax to asynchronously get access to MODBUS register

I am trying to run three simple tasks in parallel using asyncio and shared global variables.
Two of them are working perfectly. One reads websockets (async with websockets.connect("ws://192.168.1.137:9000") as websocket:); another one accesses IO via a dedicated library.
I did not find any solution, or the right syntax, for getting AsyncModbusTCPClient running within the third task (sync MODBUS is easy to implement but would not fit within an async task).
The following just blocks everything:
async def get_var_modbus(loop):
    client = await AsyncModbusTCPClient(schedulers.ASYNC_IO, host="192.168.1.200", loop=loop, port=502, timeout=20, unit=3)
    while True:
        print("INIT")
        print("Reading coils")
        rr = await client.read_input_registers(0, 1, unit=0x03)
        print(rr.registers)
        await asyncio.sleep(1)
Full code below
from pymodbus.client.asynchronous import schedulers
from pymodbus.client.asynchronous.tcp import AsyncModbusTCPClient
import json
import time
from pypx800v5 import *
import aiohttp
import asyncio
import requests_async as requests
import numpy as np
import logging
from datetime import datetime
import websockets
import contextvars
import warnings
warnings.filterwarnings("ignore", category=DeprecationWarning)
# SDM230 via MODBUS
SDM230A=["Voltage","Current","Active Power","Apparent Power","Reactive Power","Power Factor","Phase Angle","Frequency","Import Active Energy","Export Active Energy","Import Reactive Energy","Export Reactive Energy"]
SDM230B=["Total system power demand","Maximum total system power demand","Current system positive power demand","Maximum system positive power demand","Current system reverse power demand","Maximum system reverse power demand"]
SDM230C=["Current demand","Maximum current Demand"]
SDM230D=["Total Active Energy","Total Reactive Energy"]
SDM230Labels=SDM230A+SDM230B+SDM230C+SDM230D
SDM230Var=["Voltage","Current","ActivePower","ApparentPower","ReactivePower","PowerFactor","PhaseAngle","Frequency","ImportActiveEnergy","ExportActiveEnergy","ImportReactiveEnergy","ExportReactiveEnergy","TotalSysPowerDemand","MaxTotalSysPowerDemand","CurrentSysPositivePowerDemand","MaxSysPositivePowerDemand","CurrentSysReversePowerDemand","MaxSysReversePowerDemand","CurrentDemand","MaximumCurrentDemand","TotalActiveEnergy","TotalReactiveEnergy"]
VoltageAdd=262199
CurrentAdd=262200
ActivePowerAdd=262201
ImportActiveEnergyAdd=262202
# inversor via Websockets
TempChaudiereAdd=262198
PuissMaxChauffeauAdd=262193
WREDAdd=262194
PacBat6TLAdd=262195
totPVAdd=262196
SOC6TLAdd=262197
# shared variables
WRED= 0
PacBat6TL=0
PacPV6TL=0
Pac6TLM=0
SOC6TL=0
PAC6TL=0
totPV=0
# --------------------------------------------------------------------------- #
# configure the client logging
# --------------------------------------------------------------------------- #
logging.basicConfig()
log = logging.getLogger()
log.setLevel(logging.DEBUG)
async def get_var_modbus(loop):
    client = await AsyncModbusTCPClient(schedulers.ASYNC_IO, host="192.168.1.200", port=502, loop=loop, timeout=20, unit=3)
    while True:
        print("INIT")
        print("Reading coils")
        rr = await client.read_input_registers(0, 1, unit=0x03)
        print(rr.registers)
        await asyncio.sleep(1)

async def get_var_socket():
    global WRED
    global PacBat6TL
    global PacPV6TL
    global Pac6TLM
    global SOC6TL
    global PAC6TL
    global totPV
    print("")
    i = 0
    async with websockets.connect("ws://192.168.1.137:9000") as websocket:
        while True:
            i = i + 1
            data = await websocket.recv()
            try:
                message = json.loads(data)
            except:
                break
            if "product" in message:
                if message["product"] == "ems":
                    print(message)
            if "WRED" in message:
                WRED = message["WRED"]
            if "PacBat6TL" in message:
                PacBat6TL = message["PacBat6TL"]
            if "PacPV6TL" in message:
                PacPV6TL = message["PacPV6TL"]
                totPV = PacPV6TL
            if "Pac6TLM" in message:
                Pac6TLM = message["Pac6TLM"]
                totPV = totPV + Pac6TLM
            if "SOC6TL" in message:
                SOC6TL = message["SOC6TL"]
            if "PAC6TL" in message:
                PAC6TL = message["PAC6TL"]

async def get_ipx_update():
    print("")
    async with IPX800(host='192.168.1.139', api_key='API') as ipx:
        await ipx.init_config()
        while True:
            try:
                await ipx.update_ana(WREDAdd, WRED)
            except:
                print("ERROR")
            try:
                await ipx.update_ana(PacBat6TLAdd, PacBat6TL)
            except:
                print("ERROR")
            try:
                await ipx.update_ana(totPVAdd, totPV)
            except:
                print("ERROR")
            try:
                await ipx.update_ana(SOC6TLAdd, SOC6TL)
            except:
                print("ERROR")
            await asyncio.sleep(1)

def main():
    loop = asyncio.get_event_loop()
    loop.create_task(get_var_socket())
    loop.create_task(get_ipx_update())
    loop.create_task(get_var_modbus(loop))
    loop.run_forever()

if __name__ == '__main__':
    try:
        main()
    except Exception as f:
        print('main error: ', f)
        time.sleep(3)
Using the async_modbus library (built on top of umodbus, https://pypi.org/project/async-modbus/), it works very well.
I have used this library with success.
Please find the syntax below:
async def get_var_modbus(loop):
    reader, writer = await asyncio.open_connection('192.168.1.200', 502)
    client = AsyncTCPClient((reader, writer))
    while True:
        print("Reading holding registers ADAM3066")
        reply = await client.read_holding_registers(slave_id=3, starting_address=0, quantity=8)
        print("reply:", reply)
        await asyncio.sleep(1)
OUTPUT:
Reading holding registers ADAM3066
reply: [65535 65535 65535 65535 289 65535 65535 65535]
The ADAM 3066 is an RS-485 MODBUS RTU 1-wire interface connected to a MODBUS TCP gateway at 192.168.1.200. I have one sensor connected to input 5 of the ADAM 3066, which returns a temperature of 28.9 °C.

How to call async method from greenlet (playwright)

My framework (Locust, https://github.com/locustio/locust) is based on gevent and greenlets. But I would like to leverage Playwright (https://playwright.dev/python/), which is built on asyncio.
Naively using Playwright's sync API doesn't work and gives an exception:

playwright._impl._api_types.Error: It looks like you are using Playwright Sync API inside the asyncio loop.
Please use the Async API instead.

I'm looking for some kind of best practice on how to use asyncio in combination with gevent.
I've tried a couple of different approaches, but I don't know if I'm close or if what I'm trying to do is even possible (I have some experience with gevent, but haven't really used asyncio before).
Edit: I kind of have something working now (I've removed Locust and just directly spawned some greenlets to make it easier to understand). Is this as good as it gets, or is there a better solution?
import asyncio
import threading
from playwright.async_api import async_playwright
import gevent

def thr(i):
    loop = asyncio.new_event_loop()
    asyncio.set_event_loop(loop)
    loop.run_until_complete(do_stuff(i))
    loop.close()

async def do_stuff(i):
    playwright = await async_playwright().start()
    browser = await playwright.chromium.launch(headless=False)
    page = await browser.new_page()
    await page.wait_for_timeout(5000)
    await page.goto("https://google.com")
    await page.close()
    print(i)

def green(i):
    t = threading.Thread(target=thr, args=(i,))
    t.start()
    # t.join()  # joining doesn't work, but I couldn't be bothered right now :)

g1 = gevent.spawn(green, 1)
g2 = gevent.spawn(green, 2)
g1.join()
g2.join()
Inspired by @user4815162342's comment, I went with something like this:
from playwright.async_api import async_playwright  # need to import this first
from gevent import monkey, spawn
import asyncio
import gevent

monkey.patch_all()
loop = asyncio.new_event_loop()

async def f():
    print("start")
    playwright = await async_playwright().start()
    browser = await playwright.chromium.launch(headless=True)
    context = await browser.new_context()
    page = await context.new_page()
    await page.goto("https://www.google.com")
    print("done")

def greeny():
    while True:  # and not other_exit_condition
        future = asyncio.run_coroutine_threadsafe(f(), loop)
        while not future.done():
            gevent.sleep(1)

greenlet1 = spawn(greeny)
greenlet2 = spawn(greeny)
loop.run_forever()
The actual implementation will end up in Locust some day, probably after some optimization (reusing the browser instance, etc.).
Here's a simple way to integrate asyncio and gevent:

Run an asyncio loop in a dedicated thread.
Use asyncio.run_coroutine_threadsafe() to run a coroutine.
Use gevent.event.Event to wait until the coroutine resolves.

import asyncio
import threading
import gevent

loop = asyncio.new_event_loop()
loop_thread = threading.Thread(target=loop.run_forever, daemon=True)
loop_thread.start()

async def your_coro():
    ...

def wait_until_complete(coro):
    future = asyncio.run_coroutine_threadsafe(coro, loop)
    event = gevent.event.Event()
    future.add_done_callback(lambda _: event.set())
    event.wait()
    return future.result()

result = wait_until_complete(your_coro())
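The gevent-free core of this recipe (a dedicated loop thread plus run_coroutine_threadsafe) can be sketched with only the standard library; double is a stand-in coroutine, and blocking on Future.result() plays the role the gevent Event plays above:

```python
import asyncio
import threading

# Dedicated event loop running forever in a daemon thread.
loop = asyncio.new_event_loop()
threading.Thread(target=loop.run_forever, daemon=True).start()

async def double(x):
    await asyncio.sleep(0.01)  # stand-in for real async work (e.g. Playwright)
    return 2 * x

def wait_until_complete(coro):
    # Submit the coroutine to the loop thread and block until it resolves.
    # concurrent.futures.Future.result() is the thread-safe wait here; under
    # gevent you would poll the future or use gevent.event.Event instead,
    # so that other greenlets keep running.
    return asyncio.run_coroutine_threadsafe(coro, loop).result()

print(wait_until_complete(double(21)))  # 42
```

run_coroutine_threadsafe may be called before the loop thread has actually entered run_forever; the call is queued thread-safely either way.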

How to combine callback-based library with asyncio library in Python?

I have the following issue. I want to read out keystrokes with the pynput library and send them over websockets. Pynput proposes the following usage
from pynput import keyboard

def on_press(key):
    try:
        print('alphanumeric key {0} pressed'.format(key.char))
    except AttributeError:
        print('special key {0} pressed'.format(key))

def on_release(key):
    print('{0} released'.format(key))
    if key == keyboard.Key.esc:
        # Stop listener
        return False

# Collect events until released
with keyboard.Listener(
        on_press=on_press,
        on_release=on_release) as listener:
    listener.join()

# ...or, in a non-blocking fashion:
listener = keyboard.Listener(
    on_press=on_press,
    on_release=on_release)
listener.start()
(taken from https://pynput.readthedocs.io/en/latest/keyboard.html)
In contrast to that, the websockets library is called as follows:
import asyncio
import websockets

async def hello():
    uri = "ws://localhost:8765"
    async with websockets.connect(uri) as websocket:
        name = input("What's your name? ")
        await websocket.send(name)
        print(f"> {name}")
        greeting = await websocket.recv()
        print(f"< {greeting}")

asyncio.get_event_loop().run_until_complete(hello())
(taken from https://websockets.readthedocs.io/en/stable/intro.html).
I am now struggling with how this can be done, as the websockets library is asynchronous and pynput is synchronous. I somehow have to inject a websocket.send() into on_press/on_release, but I am currently stuck on this.
Note that your pynput example contains two different variants of using pynput, from which you need to choose the latter because it is easier to connect to asyncio. The keyboard listener will allow the program to proceed with the asyncio event loop, while invoking the callbacks from a separate thread. Inside the callback functions you can use call_soon_threadsafe to communicate the key-presses to asyncio, e.g. using a queue. For example (untested):
def transmit_keys():
    # Start a keyboard listener that transmits keypresses into an
    # asyncio queue, and immediately return the queue to the caller.
    queue = asyncio.Queue()
    loop = asyncio.get_event_loop()

    def on_press(key):
        # this callback is invoked from another thread, so we can't
        # just queue.put_nowait(key.char), we have to go through
        # call_soon_threadsafe
        loop.call_soon_threadsafe(queue.put_nowait, key.char)

    pynput.keyboard.Listener(on_press=on_press).start()
    return queue

async def main():
    key_queue = transmit_keys()
    async with websockets.connect("ws://localhost:8765") as websocket:
        while True:
            key = await key_queue.get()
            await websocket.send(f"key pressed: {key}")

asyncio.run(main())
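The call_soon_threadsafe bridge itself can be exercised without pynput (which needs a real keyboard and display). In this sketch a plain thread stands in for the listener and feeds the asyncio.Queue the same way the on_press callback would; transmit_keys and fake_listener are illustrative names, not library APIs:

```python
import asyncio
import threading

def transmit_keys(keys):
    # Same shape as the pynput version: start a background "listener" thread
    # and hand back an asyncio.Queue fed via call_soon_threadsafe.
    queue = asyncio.Queue()
    loop = asyncio.get_running_loop()

    def fake_listener():
        for key in keys:  # stands in for pynput's on_press callbacks
            loop.call_soon_threadsafe(queue.put_nowait, key)

    threading.Thread(target=fake_listener, daemon=True).start()
    return queue

async def main():
    key_queue = transmit_keys(["a", "b", "c"])
    received = [await key_queue.get() for _ in range(3)]
    print(received)  # ['a', 'b', 'c']

asyncio.run(main())
```

call_soon_threadsafe preserves submission order from a single thread, so the keys arrive in the order they were "pressed".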

cx_oracle with Asyncio in Python with SQLAlchemy

I am confused by the different threads posted at different times on this topic.
Is this asyncio feature available with the latest version (as of Dec 2019) of cx_Oracle?
I am using the code snippet below, which works, but I am not sure whether this is the right way to make async calls to Oracle. Any pointers would be helpful.
import asyncio
from time import time

import pandas as pd
from sqlalchemy import create_engine

async def sqlalchemyoracle_fetch():
    conn_start_time = time()
    oracle_tns_conn = 'oracle+cx_oracle://{username}:{password}@{tnsname}'
    engine = create_engine(
        oracle_tns_conn.format(
            username=USERNAME,
            password=PWD,
            tnsname=TNS,
        ),
        pool_recycle=50,
    )
    for x in test:
        # calling a custom query_randomizer function which executes Oracle
        # queries built from the parameters passed through `test` (a list)
        pd.read_sql(query_randomizer(x), engine)

async def main():
    tasks = [sqlalchemyoracle_fetch()]
    return await asyncio.gather(*tasks)

if __name__ == "__main__":
    result = asyncio.run(main())
I use the cx_Oracle library but not SQLAlchemy. As of v8.2, asyncio is not supported.
This issue tracks and confirms it: https://github.com/oracle/python-cx_Oracle/issues/178.
And no, your code block does not run asynchronously: although it is defined with async def, there is no statement in the block that is actually asynchronous. To be asynchronous, your async function either needs to await another async function (one that already supports async operations) or use yield to indicate a possible context switch. Neither happens in your code block.
You can try the following package, which states that it implements async support for cx_Oracle: https://pypi.org/project/cx-Oracle-async/
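Until the driver grows native async support, a common workaround is to push each blocking call into a thread pool with run_in_executor. A sketch under that assumption, where blocking_fetch is a hypothetical stand-in for the real blocking call (e.g. pd.read_sql against your engine):

```python
import asyncio
from concurrent.futures import ThreadPoolExecutor

def blocking_fetch(query):
    # Stand-in for a real cx_Oracle / SQLAlchemy call such as
    # pd.read_sql(query, engine); any blocking driver call goes here.
    return f"rows for {query!r}"

async def fetch_all(queries):
    loop = asyncio.get_running_loop()
    with ThreadPoolExecutor(max_workers=4) as pool:
        # Each blocking query runs in its own worker thread; the event loop
        # stays free while asyncio.gather awaits them concurrently.
        tasks = [loop.run_in_executor(pool, blocking_fetch, q) for q in queries]
        return await asyncio.gather(*tasks)

print(asyncio.run(fetch_all(["q1", "q2"])))  # ["rows for 'q1'", "rows for 'q2'"]
```

This gives concurrency across queries (threads), not true async I/O; the GIL is released while the driver waits on the network, which is why the pattern helps for database calls.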