multithreading - Perl Threads Excessive Memory Usage
I have written a Perl script that runs forever as a daemon process.
    use strict;
    use warnings;
    use threads;
    use Thread::Queue;

    my $child_threads_cnt = 5;
    my $q = Thread::Queue->new();

    sub ProcessJob {
        # Workers block on the queue; a dequeued undef tells them to exit.
        while (defined(my $taskhandle = $q->dequeue())) {
            my $obj    = JobProcessor->new();
            my $result = $obj->getOutput($taskhandle);
            # JobProcessor is a big module that computes/manipulates the job.
            # A MySQL query then inserts $result into the DB.
        }
    }

    # Daemon subroutine
    sub Daemon {
        while (1) {
            my @new_requests = ();   # MySQL query returning new requests from the DB (elided)
            for (@new_requests) {
                $q->enqueue($_);
            }
            sleep 5;
        }
    }

    # The main thread scans the DB and enqueues new jobs in the queue.
    my $scandb = threads->create(\&Daemon);

    # Child threads perform the processing.
    my @child = map { threads->create(\&ProcessJob) } 1 .. $child_threads_cnt;

    $scandb->join;
    $q->enqueue(undef) for 1 .. $child_threads_cnt;
    $_->join for @child;
I run the script on Unix (Perl v5.8.8) using nohup. The script dies every 4-5 days, and nohup does not capture any log that would explain the reason for the sudden death of the script (the main process). I suspected memory leaks, so I used top -p <pid> to analyse the nohup'd process.

As the script runs continuously, the VIRT (virtual memory) and RES sizes grow continuously, reaching around 2 GB within 10 hours.
Essentially, once a child thread finishes a job (ProcessJob), i.e. returns $result, the memory should have been released to the OS.

Is this thread implementation memory efficient?

What changes and optimizations are required to run the script forever, without outages or periodic restarts?

Any help is appreciated.
As mentioned in the comments - there are no obvious leaks here. You've avoided most of the key 'gotchas' of thread programming by running 'worker' threads off a queue.
I suggest you need to look at what's happening in your database calls and the object creation in the ProcessJob sub. A common trap in OO code is to create a circular reference within an object, so that when the object goes out of scope, its refcount never drops to zero and the memory is never reclaimed.
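For illustration, here is a minimal sketch of that trap using two hypothetical hash-based structures (the parent/child fields are invented for the example); weaken from Scalar::Util is the standard fix:

    use strict;
    use warnings;
    use Scalar::Util qw(weaken);

    my $parent = {};
    my $child  = {};
    $parent->{child} = $child;
    $child->{parent} = $parent;   # cycle: neither refcount can reach zero

    # Weakening one link breaks the cycle, so both structures are freed
    # once the last external reference goes away.
    weaken($child->{parent});

Inside a long-running worker thread, every object leaked this way accumulates until the process is restarted, which matches the steady growth you're seeing.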
It might be worth checking whether the destructor is ever called (http://perldoc.perl.org/perlobj.html#destructors):
    sub DESTROY {
        my ($self) = @_;
        warn $self . " object destroyed";
    }
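If that warning never appears, a CPAN module such as Devel::Cycle can locate the loop directly. A sketch, assuming JobProcessor objects are ordinary blessed references (getOutput and $taskhandle are from the question's code):

    use Devel::Cycle;

    my $obj    = JobProcessor->new();
    my $result = $obj->getOutput($taskhandle);
    find_cycle($obj);   # prints any reference cycles found inside $obj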