multithreading - How do I parallelize GPars Actors?
My understanding of GPars actors may be off, so please correct me if I'm wrong. I have a Groovy app that polls a web service for jobs. When one or more jobs are found, it sends each job to a DynamicDispatchActor I've created, and the job is handled. The jobs are self-contained and don't need to return anything to the main thread. When multiple jobs come in at once I'd like them processed in parallel, but no matter what configuration I try, the actor processes them first in, first out.
To give a code example:
    def poolGroup = new DefaultPGroup(new DefaultPool(true, 5))

    def actor = poolGroup.messageHandler {
        when { Integer msg ->
            println("I'm number ${msg} on thread ${Thread.currentThread().name}")
            Thread.sleep(1000)
        }
    }

    def integers = [1, 2, 3, 4, 5, 6, 7, 8, 9, 10]
    integers.each { actor << it }
This prints out:
    I'm number 1 on thread Actor Thread 31
    I'm number 2 on thread Actor Thread 31
    I'm number 3 on thread Actor Thread 31
    I'm number 4 on thread Actor Thread 31
    I'm number 5 on thread Actor Thread 31
    I'm number 6 on thread Actor Thread 31
    I'm number 7 on thread Actor Thread 31
    I'm number 8 on thread Actor Thread 31
    I'm number 9 on thread Actor Thread 31
    I'm number 10 on thread Actor Thread 31
with a slight pause in between each printout. Notice that each printout happens on the same actor/thread.
What I'd like to see here is the first 5 numbers printed out instantly, because the thread pool is set to 5, and then the next 5 numbers printed as threads free up. Am I way off base here?
To make it run as you expect, there are a few changes to make:
    import groovyx.gpars.group.DefaultPGroup
    import groovyx.gpars.scheduler.DefaultPool

    def poolGroup = new DefaultPGroup(new DefaultPool(true, 5))

    def closure = {
        when { Integer msg ->
            println("I'm number ${msg} on thread ${Thread.currentThread().name}")
            Thread.sleep(1000)
            stop()
        }
    }

    def integers = [1, 2, 3, 4, 5, 6, 7, 8, 9, 10]
    def actors = integers.collect { poolGroup.messageHandler(closure) << it }
    actors*.join()
Full gist file: https://gist.github.com/wololock/7f1348e04f68710e42d2
Then the output will be:
    I'm number 5 on thread Actor Thread 5
    I'm number 4 on thread Actor Thread 4
    I'm number 1 on thread Actor Thread 1
    I'm number 3 on thread Actor Thread 3
    I'm number 2 on thread Actor Thread 2
    I'm number 6 on thread Actor Thread 3
    I'm number 9 on thread Actor Thread 4
    I'm number 7 on thread Actor Thread 2
    I'm number 8 on thread Actor Thread 5
    I'm number 10 on thread Actor Thread 1
Now let's take a look at what changed. First of all, in the previous example you worked with a single actor only. You defined the poolGroup correctly, but then you created a single actor and shifted all the computation to that single instance. To run the computations in parallel you have to rely on the poolGroup and send the input directly to the message handler - the pool group will handle actor creation and lifecycle management. As in:

    def actors = integers.collect { poolGroup.messageHandler(closure) << it }

This creates a collection of actors, each started with its given input. The pool group takes care that the specified pool size is not exceeded. Then you have to join each actor, and this can be done using Groovy's magic: actors*.join(). The application will wait with termination until all actors stop their computation. That's why you have to add the stop() method to the when closure of the message handler's body - without it, the application won't terminate, because the pool group does not know that the actors did their job - they may be waiting for e.g. another message.
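To see why the pool size matters, here is a plain-JDK analogy (an illustrative sketch, not GPars code; the class name PoolDemo and its run method are hypothetical): a single-threaded executor drains its queue one task at a time, just like a single actor draining its mailbox, while a fixed pool of 5 lets up to 5 tasks run at once, like 5 actors on a pool group of size 5.

```java
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;

public class PoolDemo {

    // Processes messages 1..count on a fixed pool: poolSize 1 behaves like
    // a single actor handling its mailbox sequentially; poolSize 5 behaves
    // like five actors scheduled on a pool group of size 5.
    static List<String> run(int poolSize, int count) throws InterruptedException {
        ExecutorService pool = Executors.newFixedThreadPool(poolSize);
        List<String> lines = Collections.synchronizedList(new ArrayList<>());
        for (int i = 1; i <= count; i++) {
            final int n = i;
            pool.submit(() -> {
                try {
                    Thread.sleep(100); // simulate work, like the original example
                } catch (InterruptedException e) {
                    Thread.currentThread().interrupt();
                }
                lines.add("I'm number " + n + " on thread "
                        + Thread.currentThread().getName());
            });
        }
        pool.shutdown();
        pool.awaitTermination(10, TimeUnit.SECONDS);
        return lines;
    }

    public static void main(String[] args) throws InterruptedException {
        run(5, 10).forEach(System.out::println);
    }
}
```

With poolSize 1 the lines appear strictly in submission order; with poolSize 5 they interleave across five worker threads, mirroring the two outputs above.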
Alternative solution
We can also consider an alternative solution that uses GPars parallelized iterations:
    import groovyx.gpars.GParsPool

    // This example is a dummy; let's assume Processor is a
    // stateless component that can be shared between threads.
    class Processor {
        void process(int number) {
            println "${Thread.currentThread().name} starting number ${number}"
            Thread.sleep(1000)
        }
    }

    def integers = [1, 2, 3, 4, 5, 6, 7, 8, 9, 10]
    Processor processor = new Processor()

    GParsPool.withPool 5, {
        integers.eachParallel {
            processor.process(it)
        }
    }
In this example we have a stateless component Processor, and we parallelized the computations using a single instance of that stateless Processor with multiple input values.
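For comparison, the same stateless-worker pattern can be sketched with plain JDK executors (a hypothetical StatelessDemo, not part of GPars): because the worker mutates no state, one instance is safely shared by all pool threads.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.Callable;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

public class StatelessDemo {

    // Stateless worker: it mutates no fields, so a single instance can
    // safely be shared between all threads in the pool.
    static class Processor {
        String process(int number) {
            return Thread.currentThread().getName() + " starting number " + number;
        }
    }

    // Runs process() for every number on a pool of 5 threads and
    // returns the results once all tasks have completed.
    static List<String> processAll(List<Integer> numbers) throws Exception {
        Processor processor = new Processor();
        ExecutorService pool = Executors.newFixedThreadPool(5);
        List<Callable<String>> tasks = new ArrayList<>();
        for (int n : numbers) {
            tasks.add(() -> processor.process(n));
        }
        List<String> results = new ArrayList<>();
        for (Future<String> future : pool.invokeAll(tasks)) { // blocks until all finish
            results.add(future.get());
        }
        pool.shutdown();
        return results;
    }

    public static void main(String[] args) throws Exception {
        processAll(List.of(1, 2, 3, 4, 5, 6, 7, 8, 9, 10))
                .forEach(System.out::println);
    }
}
```

Like eachParallel, invokeAll blocks until every task has finished, so no explicit join is needed.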
I've tried to figure out the case you mentioned in your comment, but I'm not sure if a single actor can process multiple messages at a time. The statelessness of an actor means that it does not change its internal state during the processing of a message and must not store any other information in actor scope. It would be great if someone could correct me if this reasoning is not right :)
I hope this helps you. Best!