I have some Python code that I'm debugging which has a blocking problem. I have a hypothesis about what is happening, but I do not know the Python threading mechanics well enough to verify it.
Here is the code:
```python
class Executor:
    def execute_many(self, commands):
        with_processes = zip(commands, self.process_cycle)

        def writing():
            for command, process in with_processes:
                send_command_to_process(process, command)

        writing_thread = threading.Thread(target=writing)
        writing_thread.start()

        for _, process in with_processes:
            yield receive_result_from_process(process)

        writing_thread.join()
```
and elsewhere:
```python
results = executor.execute_many(commands)
for result in results:
    make_foo(result)
```
`process_cycle` of `Executor` yields `subprocess.Popen` objects. `send_command_to_process` and `receive_result_from_process` interact with these processes through pipes.
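For context, here is a minimal sketch of what those two helpers could look like. The names come from the question, but their bodies are my assumption: I am guessing a simple line-oriented protocol over the `Popen` stdin/stdout pipes.

```python
import subprocess

def send_command_to_process(process, command):
    # Write one line to the child's stdin and flush so it is sent
    # immediately. This flush is where a writer can block if the
    # pipe buffer is full and the child is not reading.
    process.stdin.write(command + "\n")
    process.stdin.flush()

def receive_result_from_process(process):
    # Read one line of output from the child's stdout
    # (blocks until a full line is available).
    return process.stdout.readline().rstrip("\n")

# Example child process: echoes each input line back, uppercased.
proc = subprocess.Popen(
    ["python3", "-u", "-c",
     "import sys\n"
     "for line in sys.stdin:\n"
     "    print(line.strip().upper())"],
    stdin=subprocess.PIPE, stdout=subprocess.PIPE, text=True,
)
send_command_to_process(proc, "hello")
result = receive_result_from_process(proc)
print(result)
proc.stdin.close()
proc.wait()
```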
The problem is that, from time to time, this code gets stuck: all the `Popen` processes and `writing_thread` are blocked on flush after writing to a pipe.
I did not expect this to happen, because (even if the buffers are full) the `execute_many` generator yields `receive_result_from_process(process)`, which should unblock one of the processes — but that does not happen, and `execute_many` gets stuck inside the loop.
So I came up with a hypothesis: when `writing_thread` is blocked by a full pipe buffer, the main thread is blocked too (they are in the same process). Is this possible? If so, is it a Python feature or a Linux feature?
TL;DR
If there are two threads in a Python process and one of them is blocked on flush after writing to a full pipe buffer, can it block the other thread?
If so, is it a Python feature or a Linux feature?
Or, put another way: if one thread is blocked, shouldn't the other threads be able to continue execution?
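The hypothesis can be tested directly with a small experiment (my own sketch, not from the question): start a thread that writes into a pipe nobody reads from, so the write blocks once the OS pipe buffer fills, and then check whether the main thread keeps running.

```python
import os
import threading
import time

# Create a pipe and never read from the read end, so the
# write end's buffer will eventually fill up.
r, w = os.pipe()

def writer():
    # Keep writing until the kernel pipe buffer (typically 64 KiB
    # on Linux) is full; the next os.write() then blocks in the kernel.
    while True:
        os.write(w, b"x" * 4096)

t = threading.Thread(target=writer, daemon=True)
t.start()

time.sleep(1)  # give the writer time to fill the buffer and block

# If a blocked thread blocked the whole process, we would never get here.
ticks = 0
for _ in range(5):
    ticks += 1
    time.sleep(0.1)
print("main thread still running, ticks =", ticks)
```

On Linux with CPython, the main thread keeps ticking: the blocked `os.write` releases the GIL while it waits in the kernel, so one thread blocked on pipe I/O does not, by itself, stop the others.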