multithreading - Making massive numbers of HTTP requests in Python


I'm trying to test a web application. Part of this involves making ~10k requests, taking the few (<1k) that return 200 OK, and going through their data. The webapp is buggy and there are false positives, so each 200 OK needs to be at least triple-checked.

I'm working in Python, trying threading and urllib, but on Linux I get thread errors after ~920 threads. (My theory is that it's /proc/sys/kernel/threads-max divided by thirty, which is eerily accurate, though it's perturbing that each thread would register as 30 threads with the OS.) In any case, I'm looking for a good solution for this task. I've looked at Twisted, but it seems to still be bound to threading.

Any ideas?
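One common way around the thread-count limit is a fixed pool of worker threads instead of one thread per request. Below is a minimal sketch using the standard library's concurrent.futures and urllib.request; the URL list, pool size, and timeout are placeholder assumptions, not part of the original post:

    import concurrent.futures
    import urllib.error
    import urllib.request

    URLS = ["http://example.com/item/%d" % i for i in range(10000)]  # placeholder URLs

    def fetch(url):
        # Return (url, status) without raising on non-200 responses.
        try:
            with urllib.request.urlopen(url, timeout=10) as resp:
                return url, resp.getcode()
        except urllib.error.HTTPError as e:
            return url, e.code
        except urllib.error.URLError:
            return url, None

    # A fixed pool keeps the OS thread count bounded no matter how many URLs there are.
    with concurrent.futures.ThreadPoolExecutor(max_workers=50) as pool:
        results = list(pool.map(fetch, URLS))

    ok = [url for url, status in results if status == 200]
    print(len(ok), "URLs returned 200 OK")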

I was testing with Apache ab against a Tornado web server and was unable to get above 1000 connections per second on a dual-core Athlon @ 2 GHz. About 30% of the resources were taken by the testing tool (ab) and the rest by the server. I'm pretty convinced that most of the resources are spent in the OS and the IP/Ethernet layer.

http://amix.dk/blog/post/19581
Non-blocking servers have better performance than blocking servers since they do not spawn a thread for each connection. In theory they can run in a single thread.
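The same single-threaded, non-blocking idea applies on the client side. As a rough sketch (using asyncio and the third-party aiohttp library, which are not mentioned in the original post; the URL list and concurrency limit are assumptions):

    import asyncio
    import aiohttp

    URLS = ["http://example.com/item/%d" % i for i in range(10000)]  # placeholder URLs

    async def fetch(session, sem, url):
        # The semaphore caps in-flight requests; no extra OS threads are created.
        async with sem:
            try:
                async with session.get(url, timeout=aiohttp.ClientTimeout(total=10)) as resp:
                    return url, resp.status
            except aiohttp.ClientError:
                return url, None

    async def main():
        sem = asyncio.Semaphore(200)  # at most 200 concurrent requests
        async with aiohttp.ClientSession() as session:
            results = await asyncio.gather(*(fetch(session, sem, u) for u in URLS))
        ok = [url for url, status in results if status == 200]
        print(len(ok), "URLs returned 200 OK")

    asyncio.run(main())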

