Infinite "background" loop for Tornado WebSockets handler - websocket

I am trying to create a WebSocket server using Tornado. What I would like to do is execute a specific command, that will dispatch a message for every cycle of the IOLoop.
To make it clearer, let's say I have the following WebSocket handler:
class MyHandler(websocket.WebSocketHandler):
    def auto_loop(self, *args, **kwargs):
        self.write_message('automatic message')
Is there any way to run auto_loop on every IOLoop cycle, without blocking the main thread?
I suppose that I can use greenlets for that, but I am searching for a more Tornado-native solution.
Thank you

You shouldn't write a message on every IOLoop cycle: you'll overwhelm your system. You want to send it every few milliseconds or seconds. A coroutine will do nicely:
import datetime

from tornado.ioloop import IOLoop
from tornado import gen

handlers = set()

@gen.coroutine
def auto_loop():
    while True:
        for handler in handlers:
            handler.write_message('automatic message')
        yield gen.Task(
            IOLoop.current().add_timeout,
            datetime.timedelta(milliseconds=500))

if __name__ == '__main__':
    # ... application setup ...
    # Start looping.
    auto_loop()
    IOLoop.current().start()
In MyHandler.open(), do handlers.add(self), and in MyHandler.on_close() do handlers.discard(self).
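For completeness, a minimal sketch of that bookkeeping, assuming the handlers set from the answer above and the usual from tornado import websocket import:
class MyHandler(websocket.WebSocketHandler):
    def open(self):
        # Register this connection so auto_loop() can reach it.
        handlers.add(self)

    def on_close(self):
        # Forget the connection once the client goes away.
        handlers.discard(self)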

You might also be interested in PeriodicCallback
http://tornado.readthedocs.org/en/latest/ioloop.html#tornado.ioloop.PeriodicCallback
from tornado.ioloop import PeriodicCallback

def callback(self):
    self.write_message("Hi there!")

def open(self):
    self.write_message("Connected.")
    self.pCallback = PeriodicCallback(self.callback,
                                      callback_time=250)
    self.pCallback.start()
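It is also worth stopping the callback when the socket closes, so it does not keep writing to a dead connection; a minimal sketch, assuming the pCallback attribute from the snippet above:
def on_close(self):
    # Stop the periodic callback when the client disconnects,
    # otherwise it keeps firing against a closed socket.
    if getattr(self, 'pCallback', None) is not None:
        self.pCallback.stop()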

Related

How to call async method from greenlet (playwright)

My framework (Locust, https://github.com/locustio/locust) is based on gevent and greenlets. But I would like to leverage Playwright (https://playwright.dev/python/), which is built on asyncio.
Naively using Playwright's sync API doesn't work and gives an exception:
playwright._impl._api_types.Error: It looks like you are using Playwright Sync API inside the asyncio loop.
Please use the Async API instead.
I'm looking for some kind of best practice on how to use async in combination with gevent.
I've tried a couple of different approaches, but I don't know if I'm close or if what I'm trying to do is even possible (I have some experience with gevent, but haven't really used asyncio before).
Edit: I kind of have something working now (I've removed Locust and just directly spawned some greenlets to make it easier to understand). Is this as good as it gets, or is there a better solution?
import asyncio
import threading

import gevent
from playwright.async_api import async_playwright

def thr(i):
    loop = asyncio.new_event_loop()
    asyncio.set_event_loop(loop)
    loop.run_until_complete(do_stuff(i))
    loop.close()

async def do_stuff(i):
    playwright = await async_playwright().start()
    browser = await playwright.chromium.launch(headless=False)
    page = await browser.new_page()
    await page.wait_for_timeout(5000)
    await page.goto("https://google.com")
    await page.close()
    print(i)

def green(i):
    t = threading.Thread(target=thr, args=(i,))
    t.start()
    # t.join()  # joining doesn't work, but I couldn't be bothered right now :)

g1 = gevent.spawn(green, 1)
g2 = gevent.spawn(green, 2)
g1.join()
g2.join()
Inspired by @user4815162342's comment, I went with something like this:
from playwright.async_api import async_playwright  # need to import this first
from gevent import monkey, spawn
import asyncio
import gevent

monkey.patch_all()

loop = asyncio.new_event_loop()

async def f():
    print("start")
    playwright = await async_playwright().start()
    browser = await playwright.chromium.launch(headless=True)
    context = await browser.new_context()
    page = await context.new_page()
    await page.goto("https://www.google.com")
    print("done")

def greeny():
    while True:  # and not other_exit_condition
        future = asyncio.run_coroutine_threadsafe(f(), loop)
        while not future.done():
            gevent.sleep(1)

greenlet1 = spawn(greeny)
greenlet2 = spawn(greeny)
loop.run_forever()
The actual implementation will end up in Locust some day, probably after some optimization (reusing browser instance etc)
Here's a simple way to integrate asyncio and gevent:
Run an asyncio loop in a dedicated thread
Use asyncio.run_coroutine_threadsafe() to run a coroutine
Use gevent.event.Event to wait until the coroutine resolves
import asyncio
import threading

import gevent.event

loop = asyncio.new_event_loop()
loop_thread = threading.Thread(target=loop.run_forever, daemon=True)
loop_thread.start()

async def your_coro():
    ...  # your async code here

def wait_until_complete(coro):
    future = asyncio.run_coroutine_threadsafe(coro, loop)
    event = gevent.event.Event()
    future.add_done_callback(lambda _: event.set())
    event.wait()
    return future.result()

result = wait_until_complete(your_coro())

How to combine callback-based library with asyncio library in Python?

I have the following issue: I want to read out keystrokes with the pynput library and send them over websockets. pynput proposes the following usage:
from pynput import keyboard

def on_press(key):
    try:
        print('alphanumeric key {0} pressed'.format(
            key.char))
    except AttributeError:
        print('special key {0} pressed'.format(
            key))

def on_release(key):
    print('{0} released'.format(
        key))
    if key == keyboard.Key.esc:
        # Stop listener
        return False

# Collect events until released
with keyboard.Listener(
        on_press=on_press,
        on_release=on_release) as listener:
    listener.join()

# ...or, in a non-blocking fashion:
listener = keyboard.Listener(
    on_press=on_press,
    on_release=on_release)
listener.start()
(taken from https://pynput.readthedocs.io/en/latest/keyboard.html)
In contrast to that, the websockets library is used as follows:
import asyncio
import websockets

async def hello():
    uri = "ws://localhost:8765"
    async with websockets.connect(uri) as websocket:
        name = input("What's your name? ")
        await websocket.send(name)
        print(f"> {name}")
        greeting = await websocket.recv()
        print(f"< {greeting}")

asyncio.get_event_loop().run_until_complete(hello())
(taken from https://websockets.readthedocs.io/en/stable/intro.html).
I am now struggling with how to combine the two, since the websockets library is asynchronous and pynput is synchronous. I somehow have to inject a websocket.send() into on_press/on_release, but so far I haven't managed to do so.
Note that your pynput example contains two different variants of using pynput, of which you should choose the latter because it is easier to connect to asyncio. The keyboard listener allows the program to proceed with the asyncio event loop while invoking the callbacks from a separate thread. Inside the callback functions you can use call_soon_threadsafe to communicate the key presses to asyncio, e.g. using a queue. For example (untested):
import asyncio

import pynput.keyboard
import websockets

def transmit_keys():
    # Start a keyboard listener that transmits keypresses into an
    # asyncio queue, and immediately return the queue to the caller.
    queue = asyncio.Queue()
    loop = asyncio.get_event_loop()
    def on_press(key):
        # This callback is invoked from another thread, so we can't
        # just queue.put_nowait(key.char); we have to go through
        # call_soon_threadsafe.
        loop.call_soon_threadsafe(queue.put_nowait, key.char)
    pynput.keyboard.Listener(on_press=on_press).start()
    return queue

async def main():
    key_queue = transmit_keys()
    async with websockets.connect("ws://localhost:8765") as websocket:
        while True:
            key = await key_queue.get()
            await websocket.send(f"key pressed: {key}")

asyncio.run(main())

flask_socketio concurrency using eventlet

I am trying to handle multiple concurrent requests using flask_socketio and eventlet. However, it does not work as expected: while test1() is running, it blocks the execution of test2(), as shown in the output below.
How can I make the server handle both requests simultaneously?
Server (Python):
import eventlet
eventlet.monkey_patch()

from flask import Flask, render_template
from flask_socketio import SocketIO, send, emit

app = Flask(__name__)
socketio = SocketIO(app, async_mode='eventlet')

@socketio.on('test1')
def test1():
    print('test1 started')
    do_complicated_calculation()  # takes some time
    print('test1 done')

@socketio.on('test2')
def test2():
    print('test2')

if __name__ == '__main__':
    socketio.run(app)
Client (JavaScript):
import io from 'socket.io-client';
socket = io('http://localhost:5000');
socket.emit('test1');
socket.emit('test2');
Expected Output:
test1 started
test2
test1 done
Actual Output:
test1 started
test1 done
test2
As discussed on GitHub, you need to insert socketio.sleep(0) calls as often as you can inside your long computation, ideally inside a loop so that it happens at regular intervals. That will allow the eventlet scheduler to give the CPU to your second task while the first task is running.
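A rough sketch of what that can look like; work_chunks() and process_chunk() are hypothetical placeholders for slicing up do_complicated_calculation(), not names from the question:
@socketio.on('test1')
def test1():
    print('test1 started')
    for chunk in work_chunks():   # hypothetical: slices of the long computation
        process_chunk(chunk)      # hypothetical: handle one slice
        socketio.sleep(0)         # yield to eventlet so test2 can run in between
    print('test1 done')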

Using the Decorator approach with AutobahnWS, how to publish messages independent of subscription callbacks and their Session-Reference?

When working with Autobahn and WAMP before, I have been using the subclassing approach, but I stumbled over the decorator/functions approach, which I really prefer over subclassing.
However, I have a function that is called from external hardware (via a callback), and this function needs to publish to the Crossbar.io router whenever it is called.
This is how I've done it, keeping a reference to the session right after on_join -> async def joined(session, details) was called:
from autobahn.asyncio.component import Component
from autobahn.asyncio.component import run

global_session = None

comp = Component(
    transports=u"ws://localhost:8080/ws",
    realm=u"realm1",
)

def callback_from_hardware(msg):
    if global_session is None:
        return
    global_session.publish(u'com.someapp.somechannel', msg)

@comp.on_join
async def joined(session, details):
    global global_session
    global_session = session
    print("session ready")

if __name__ == "__main__":
    run([comp])
This approach of keeping a reference after the component has joined the connection feels a bit "odd", however. Is there a different approach to this? Can it be done some other way?
If not, then subclassing feels a bit more "right", with all the application-dependent code inside that subclass (though keeping everything of my app within one subclass also feels odd).
I would recommend using an asynchronous queue instead of a shared session:
import asyncio

from autobahn.asyncio.component import Component
from autobahn.asyncio.component import run

queue = asyncio.queues.Queue()

comp = Component(
    transports=u"ws://localhost:8080/ws",
    realm=u"realm1",
)

def callback_from_hardware(msg):
    queue.put_nowait((u'com.someapp.somechannel', msg,))

@comp.on_join
async def joined(session, details):
    print("session ready")
    while True:
        topic, message, = await queue.get()
        print("Publishing: topic: `%s`, message: `%s`" % (topic, message))
        session.publish(topic, message)

if __name__ == "__main__":
    callback_from_hardware("dassdasdasd")
    run([comp])
There are multiple approaches you could take here, though the simplest IMO would be to use Crossbar's HTTP bridge. Whenever an event callback is received from your hardware, you can just make an HTTP POST request to Crossbar and your message will get delivered.
More details about the HTTP bridge: https://crossbar.io/docs/HTTP-Bridge-Publisher/
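For illustration, a rough sketch of that variant; the /publish path and the port are assumptions that depend on how the HTTP Publisher is configured in your Crossbar router:
import requests

def callback_from_hardware(msg):
    # Let Crossbar's HTTP bridge do the WAMP publish for us,
    # so no session reference is needed in this process.
    requests.post(
        "http://localhost:8080/publish",           # assumed bridge endpoint
        json={"topic": "com.someapp.somechannel", "args": [msg]},
    )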

send data from celery to tornado websocket

I have some periodic tasks which I execute with Celery (parse pages).
I have also set up a websocket with Tornado.
I want to pass data from the periodic tasks to Tornado, then write this data to the websocket and use it on my HTML page.
How can I do this?
I tried to import the module with the Tornado websocket from my module with Celery tasks, but of course that didn't work.
I only know how to return some data if I get a message from the client side. Here is how I cope with it:
import tornado.httpserver
import tornado.websocket
import tornado.ioloop
import tornado.web
import socket

'''
This is a simple Websocket Echo server that uses the Tornado websocket handler.
Please run `pip install tornado` with python of version 2.7.9 or greater to install tornado.
This program will echo back the reverse of whatever it receives.
Messages are output to the terminal for debugging purposes.
'''

class handler():
    wss = []

class WSHandler(tornado.websocket.WebSocketHandler):
    def open(self):
        print('new connection')
        if self not in handler.wss:
            handler.wss.append(self)

    def on_message(self, message):
        print('message received: ' + message)
        wssend('Ihaaaa')

    def on_close(self):
        print('connection closed')
        if self in handler.wss:
            handler.wss.remove(self)

    def check_origin(self, origin):
        return True

def wssend(message):
    print(handler.wss)
    for ws in handler.wss:
        if not ws.ws_connection.stream.socket:
            print("Web socket does not exist anymore!!!")
            handler.wss.remove(ws)
        else:
            print('I am trying!')
            ws.write_message(message)
            print('tried')

application = tornado.web.Application([
    (r'/ws', WSHandler),
])

if __name__ == "__main__":
    http_server = tornado.httpserver.HTTPServer(application)
    http_server.listen(8888)
    myIP = socket.gethostbyname(socket.gethostname())
    print('*** Websocket Server Started at %s***' % myIP)
    main_loop = tornado.ioloop.IOLoop.instance()
    main_loop.start()
One option is to add an HTTP handler in Tornado and have the Celery task POST its results to that handler.
From there, the data can be passed on to the websocket clients.
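A minimal sketch of that idea, reusing the wssend() helper and WSHandler from the server above; the /celery-result route and the payload format are assumptions, not part of the original code:
import tornado.web

class CeleryResultHandler(tornado.web.RequestHandler):
    def post(self):
        # The Celery task POSTs its parsed data here; broadcast it
        # to every open websocket via the wssend() helper above.
        wssend(self.get_body_argument('data'))

application = tornado.web.Application([
    (r'/ws', WSHandler),
    (r'/celery-result', CeleryResultHandler),
])

# From inside the Celery task (a separate process), something like:
#   requests.post('http://localhost:8888/celery-result',
#                 data={'data': parsed_result})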
