Working with the asyncio paradigm is now familiar to many Python programmers, thanks to
popular asynchronous frameworks such as FastAPI, Faust, or Trio (name yours...).
Sometimes, though, we need to use good libraries that do not (yet?) embrace the not-so-new
asyncio style and provide only blocking APIs. Think of SQLAlchemy, among others.
Calling such code from a coroutine running under event loop control will block the event loop
while it executes, and thus cause a performance problem.
Fortunately, the standard library provides the concurrent.futures package, which lets us work
around this annoyance and turn a blocking function or method into a coroutine.
The code
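The original code listing is missing from this copy. The snippet below is a minimal sketch of the recipe described above, assuming a ThreadPoolExecutor and a made-up blocking_fetch function standing in for a blocking library call:

```python
import asyncio
import functools
import time
from concurrent.futures import ThreadPoolExecutor

# The executor may be None (use the loop's default executor),
# a ThreadPoolExecutor, or a ProcessPoolExecutor instance.
_executor = ThreadPoolExecutor(max_workers=4)

def blocking_fetch(url):
    # Stand-in for a blocking library call (e.g. a synchronous SQLAlchemy query).
    time.sleep(0.1)
    return f"data from {url}"

async def fetch(url):
    # run_in_executor submits the blocking call to the pool and returns
    # an awaitable, so the event loop stays responsive in the meantime.
    loop = asyncio.get_running_loop()
    return await loop.run_in_executor(
        _executor, functools.partial(blocking_fetch, url))
```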
The demo
Just paste this code under the above one...
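The demo code is likewise missing here; below is a sketch of what such a demo might look like, repeating the executor setup so it runs on its own. The point it demonstrates: blocking calls dispatched through the pool overlap instead of serializing on the event loop.

```python
import asyncio
import time
from concurrent.futures import ThreadPoolExecutor

_executor = ThreadPoolExecutor(max_workers=4)

def blocking_sleep(seconds):
    # Simulates a blocking call such as a synchronous network request.
    time.sleep(seconds)
    return seconds

async def main():
    loop = asyncio.get_running_loop()
    start = time.monotonic()
    # The four blocking calls run concurrently in the pool:
    # total wall time is ~0.5s instead of ~2s run serially.
    results = await asyncio.gather(
        *(loop.run_in_executor(_executor, blocking_sleep, 0.5)
          for _ in range(4)))
    print(f"{results} in {time.monotonic() - start:.2f}s")
    return results

results = asyncio.run(main())
```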
About the pool executor
You may set _executor to either None, a ThreadPoolExecutor instance, or a
ProcessPoolExecutor instance. Read the concurrent.futures documentation to choose the one
best suited to your workload. And read the Drawbacks below.
The pool executor constructors take an optional max_workers argument that you may want to
tune to obtain the best possible performance.
My hints: choose a ThreadPoolExecutor for I/O-bound functions, and a ProcessPoolExecutor
for CPU-heavy callables.
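As an illustration of the second hint, here is a sketch using a ProcessPoolExecutor for a CPU-bound callable (cpu_heavy is a made-up example; the __main__ guard matters because worker processes may re-import the module):

```python
import asyncio
from concurrent.futures import ProcessPoolExecutor

def cpu_heavy(n):
    # CPU-bound work: the GIL keeps threads from running this in
    # parallel, so a process pool is the better fit here.
    return sum(i * i for i in range(n))

async def main():
    loop = asyncio.get_running_loop()
    with ProcessPoolExecutor(max_workers=2) as pool:
        # Both computations run in parallel on separate cores.
        return await asyncio.gather(
            loop.run_in_executor(pool, cpu_heavy, 100_000),
            loop.run_in_executor(pool, cpu_heavy, 200_000))

if __name__ == "__main__":
    print(asyncio.run(main()))
```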
Drawbacks
Using threads
You must be careful with shared objects that can be modified both by your blocking code and
by coroutines under event loop control. Avoid such sharing when possible, or protect it with locks.
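A minimal sketch of the lock approach, using a hypothetical shared counter mutated from pool threads; threading.Lock keeps the increments from racing:

```python
import asyncio
import threading
from concurrent.futures import ThreadPoolExecutor

counter = {"value": 0}
counter_lock = threading.Lock()  # protects counter across all threads

def blocking_increment(n):
    for _ in range(n):
        # Without the lock, concurrent += on the shared dict could
        # lose updates; with it, every increment is applied.
        with counter_lock:
            counter["value"] += 1

async def main():
    loop = asyncio.get_running_loop()
    with ThreadPoolExecutor(max_workers=4) as pool:
        await asyncio.gather(
            *(loop.run_in_executor(pool, blocking_increment, 10_000)
              for _ in range(4)))
    return counter["value"]

total = asyncio.run(main())
print(total)  # 40000
```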
In addition, many third-party packages expose resources that are not thread-safe and therefore
cannot be used with this recipe.
Using processes
Processes do not share any globals. Blocking code and coroutines must communicate through a
multiprocessing.Queue. That's not very comfortable.
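A sketch of such communication, with a hypothetical produce worker. Note one extra wrinkle: a plain multiprocessing.Queue cannot be pickled into pool workers, so a Manager queue proxy is used instead:

```python
import asyncio
import multiprocessing
from concurrent.futures import ProcessPoolExecutor

def produce(queue, n):
    # Runs in a worker process; a Manager queue proxy is picklable,
    # unlike a plain multiprocessing.Queue.
    for i in range(n):
        queue.put(i)

async def main(queue):
    loop = asyncio.get_running_loop()
    with ProcessPoolExecutor(max_workers=1) as pool:
        await loop.run_in_executor(pool, produce, queue, 3)
    return [queue.get() for _ in range(3)]

def run():
    with multiprocessing.Manager() as manager:
        return asyncio.run(main(manager.Queue()))

if __name__ == "__main__":
    print(run())
```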
Comments!