Hi,
I am trying to develop a mechanism that retrieves tasks and runs them in the background. Usually these tasks are something like calling a view in a given context, or calling a certain method bound to a context. Before a task executes, a transaction should be started, and at the end there should be a commit.
Before
In my last project I created a few tables in an Oracle database, which also served as our RelStorage at the same time. Jobs were then defined from within Plone and added to those tables, each with its own transaction. In the background a cronjob constantly called a dedicated client via curl/wget, which retrieved the jobs from the table and executed them in the correct context and as the original user. With a bit of jQuery and Ajax I was able to create a viewlet that showed the jobs in the queue.
Some ideas
Then a few weeks ago I created something similar that runs in a separate thread within the same client process. It looks like this:
import requests
import threading
import queue

requestQueue = queue.Queue()

def requestWorker():
    while True:
        request = requestQueue.get()
        request['sessionMethod'](*request['args'], **request['kwargs'])
        requestQueue.task_done()

def addRequest(sessionMethod, *args, **kwargs):
    requestQueue.put(dict(
        sessionMethod=sessionMethod,
        args=args,
        kwargs=kwargs
    ))

threading.Thread(target=requestWorker, daemon=True).start()
And it could be called like this:
session = requests.Session()
addRequest(session.post, 'http://plone.org', data={'some': 'data'}, timeout=15)
Now I am able to add background requests to arbitrary URLs with some POST data without blocking a normal Plone view request. Even if a request takes a long time, it is no problem: the user gets their response immediately. Of course they do not know whether the background request succeeded, but that was not the goal this time; these requests were allowed to fail.
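One caveat with the worker loop as written: an exception in any job kills the thread, and all later jobs silently pile up. If failures should at least be visible, the loop can be hardened so each job is wrapped in try/except; the logging destination here is just an assumption.

```python
import logging
import queue
import threading

logger = logging.getLogger(__name__)
requestQueue = queue.Queue()

def requestWorker():
    # Same loop as above, but a failing job no longer kills the thread:
    # the exception is logged and the worker moves on to the next job.
    while True:
        request = requestQueue.get()
        try:
            request['sessionMethod'](*request['args'], **request['kwargs'])
        except Exception:
            logger.exception('background request failed')
        finally:
            requestQueue.task_done()  # runs even on failure, so join() works

def addRequest(sessionMethod, *args, **kwargs):
    requestQueue.put(dict(sessionMethod=sessionMethod, args=args, kwargs=kwargs))

threading.Thread(target=requestWorker, daemon=True).start()
```

Because task_done() is in a finally block, requestQueue.join() can still be used to wait for all queued jobs, failed or not.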
What I really need
So, now I need something like what I described in the first section, but without cronjobs, bash scripts, external curl/wget processes, dedicated database tables, or dedicated clients. I need something like another thread running in parallel within the client itself, whose only job is to take a request and run it asynchronously. I want to add the current environment (user, permissions, etc.), the original request (or a modified one) and the context to the requestQueue, and the background thread should be able to run it accordingly. I know that a zope2instance can have multiple worker threads, so my question is: how can I create one of these workers dynamically from within my addon and push requests to it?
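In case it helps frame the question, here is a framework-agnostic sketch of the worker I have in mind. The begin/commit/abort callables are stand-ins for Zope's transaction machinery (transaction.begin()/commit()/abort()), and the queued environment is just a dict; the real version would additionally have to restore the security manager for the queued user and open its own ZODB connection per job.

```python
import queue
import threading

taskQueue = queue.Queue()

def backgroundWorker(begin, commit, abort):
    # begin/commit/abort are hypothetical stand-ins for Zope's transaction
    # machinery. Each job runs inside its own begin/commit pair; a failing
    # job gets an abort instead of a commit.
    while True:
        task = taskQueue.get()
        if task is None:          # sentinel to shut the worker down
            taskQueue.task_done()
            break
        begin()
        try:
            task['func'](task['environment'])
            commit()
        except Exception:
            abort()
        finally:
            taskQueue.task_done()

def addTask(func, environment):
    # environment would carry user, permissions, context, request, ...
    taskQueue.put(dict(func=func, environment=environment))
```

The open part of my question is exactly what is hand-waved here: how to build the environment dict from a live Plone request and replay it in a worker thread created from my addon.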
Btw. it looks like my topic is quite similar to this one: Asynchronous tasks with Plone 5.x and WSGI