python - multithreaded urllib2 freezes on nose framework
I have Python code that uses nose_parameterized, as below:
    from nose_parameterized import parameterized
    from multiprocessing.pool import ThreadPool
    import urllib2

    def make_http_call(url, req_type):
        opener = urllib2.build_opener()  # <=== this line causes the freeze
        return 1

    pool = ThreadPool(processes=4)
    results = []
    urls = ['a', 'b', 'c', 'd']
    for url in urls:
        results.append(pool.apply_async(make_http_call, (url, 'html')))

    d = {'add': []}
    for ind, res in enumerate(results):
        d['add'].append((res.get(), 2 + ind, 3 + ind))

    @parameterized(d['add'])
    def test_add(a, b, c):
        assert a + b == c
This is a dummy version of the code. Basically, I need to load test parameters from HTTP request responses, and since there are lots of URLs, I want to multithread them. As soon as I add urllib2.build_opener, it freezes when run under nose (but still works fine under plain Python). I've also tried urllib2.urlopen; same problem. Any ideas whether there is a 'proper' (debuggable) way to work around this?
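As an aside on the "debuggable" point, one sketch of an alternative (assuming Python 3 is an option, which the question does not state) is concurrent.futures: each future's result() takes a timeout, so a hung worker raises TimeoutError instead of freezing the whole run. The fetch() helper below is hypothetical, standing in for the real urllib call:

```python
from concurrent.futures import ThreadPoolExecutor

def fetch(url, req_type):
    # Hypothetical stand-in for urllib.request.urlopen(url).read();
    # returns 1 just like make_http_call in the question.
    return 1

urls = ['a', 'b', 'c', 'd']
with ThreadPoolExecutor(max_workers=4) as pool:
    futures = [pool.submit(fetch, url, 'html') for url in urls]
    # result(timeout=...) raises concurrent.futures.TimeoutError
    # instead of hanging forever, which makes freezes visible.
    params = [(f.result(timeout=10), 2 + i, 3 + i)
              for i, f in enumerate(futures)]

print(params)  # [(1, 2, 3), (1, 3, 4), (1, 4, 5), (1, 5, 6)]
```

The parameter tuples built this way have the same shape as d['add'] in the question, so they could feed @parameterized the same way.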
You can use nose's built-in multiprocess plugin for that, like:
    from nose_parameterized import parameterized
    import urllib2

    urls = ['http://www.google.com', 'http://www.yahoo.com']

    @parameterized(urls)
    def test_add(url):
        a = urllib2.urlopen(url).read()
        b = 2 + urls.index(url)
        c = 3 + urls.index(url)
        assert a + str(b) == str(c)
and run it with nosetests --processes=2. This enables distributing the test run among a set of worker processes that run the tests in parallel, as intended. Behind the scenes, the multiprocessing module is used.
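To illustrate what that looks like conceptually (this is a minimal sketch of the idea, not the plugin's actual internals), multiprocessing.Pool distributes independent work items across worker processes; run_test and run_suite below are hypothetical names:

```python
from multiprocessing import Pool

def run_test(name):
    # Stand-in for executing one test case inside a worker process.
    return (name, 'ok')

def run_suite(names, processes=2):
    # Distribute the test names across worker processes; Pool.map
    # preserves input order in its results.
    with Pool(processes=processes) as pool:
        return pool.map(run_test, names)

if __name__ == '__main__':
    print(run_suite(['test_a', 'test_b', 'test_c']))
    # [('test_a', 'ok'), ('test_b', 'ok'), ('test_c', 'ok')]
```

Because the workers are separate processes rather than threads, state that misbehaves under the test runner's threading (such as the urllib2 freeze above) is isolated per worker.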