.NET - How can a thread use less than 100% wall time?
While profiling my application (using dotTrace), I noticed something strange. I used the "wall time" measurement, which should in theory mean that every thread runs for the same amount of time.
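By wall time I mean elapsed real-world time, so even a thread that mostly sleeps or waits should report the full duration. A minimal sketch of that expectation (Stopwatch here is just my stand-in for the profiler's wall-time clock):

    using System;
    using System.Diagnostics;
    using System.Threading;

    class WallTimeExpectation
    {
        static void Main()
        {
            // Each thread is mostly idle, yet should accrue the full wall time.
            var threads = new Thread[3];
            for (int i = 0; i < threads.Length; i++)
            {
                threads[i] = new Thread(() =>
                {
                    var sw = Stopwatch.StartNew();          // wall-clock timer
                    Thread.Sleep(TimeSpan.FromSeconds(5));  // idle, ~0% CPU
                    Console.WriteLine($"Wall time: {sw.Elapsed.TotalSeconds:F1}s");
                });
                threads[i].Start();
            }
            foreach (var t in threads) t.Join();
        }
    }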
But that wasn't true: the threads I'm actually interested in displayed a total time about two times less than the others. For example, while the profiling session ran for 230 seconds, most threads report 230 seconds spent in the thread, but 5 threads show only 100-110 seconds. These are not thread pool threads, and they were created and started before profiling began.
What is going on here?
UPDATE: I'll add some more info that may or may not be relevant. The application in question (it's a game server) has 20-30 running threads. Most threads follow a simple pattern: check an incoming queue for work, and do the work if there is some. The code of a thread function looks like this:
    while (true)
    {
        if (TryDequeueWork()) // if the queue is not empty
        {
            DoWork();         // do whatever is on top
        }
        else
        {
            // m_waitHandle gets signaled when work is added to the queue
            m_waitHandle.WaitOne(MaxTimeout);
        }
    }
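For reference, here is a self-contained version of that loop as I understand it (ConcurrentQueue, AutoResetEvent, and the Enqueue method are my assumptions about the parts not shown; the real code may differ):

    using System;
    using System.Collections.Concurrent;
    using System.Threading;

    class Worker
    {
        // Assumed implementation details; the snippet above doesn't show these.
        readonly ConcurrentQueue<Action> m_queue = new ConcurrentQueue<Action>();
        readonly AutoResetEvent m_waitHandle = new AutoResetEvent(false);
        static readonly TimeSpan MaxTimeout = TimeSpan.FromMilliseconds(100);

        public void Enqueue(Action work)
        {
            m_queue.Enqueue(work);
            m_waitHandle.Set(); // wake the worker if it is waiting
        }

        public void Run()
        {
            while (true)
            {
                if (m_queue.TryDequeue(out var work)) // queue not empty
                    work();                           // do whatever is on top
                else
                    m_waitHandle.WaitOne(MaxTimeout); // signaled when work arrives
            }
        }
    }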
The threads that display the weird times look like this too, except that they serve multiple queues:
    while (true)
    {
        bool hasAnyWork = false;
        foreach (var queue in m_queues)
        {
            if (queue.TryDequeueWork())
            {
                hasAnyWork = true;
                DoWork();
            }
        }
        if (!hasAnyWork)
        {
            m_waitHandle.WaitOne(MaxTimeout);
        }
    }
The weird threads don't do any I/O except maybe logging, but the other, non-weird threads do logging too. The time spent waiting on the WaitHandle is reported by the profiler; in fact, most of the non-weird threads spend almost all of their time waiting (as they almost never have any work).
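That matches what I'd expect: a blocked thread accrues wall time but essentially no CPU time. A rough check of my own (process-level CPU time as an approximation, since this demo has only one active thread):

    using System;
    using System.Diagnostics;
    using System.Threading;

    class WaitVsCpuTime
    {
        static void Main()
        {
            var proc = Process.GetCurrentProcess();
            var cpuBefore = proc.TotalProcessorTime;
            var sw = Stopwatch.StartNew();

            // Block on a handle that is never signaled; times out after 2 seconds.
            new AutoResetEvent(false).WaitOne(TimeSpan.FromSeconds(2));

            proc.Refresh(); // re-read the process counters
            var cpuUsed = proc.TotalProcessorTime - cpuBefore;
            Console.WriteLine($"Wall: {sw.Elapsed.TotalSeconds:F1}s, CPU: {cpuUsed.TotalSeconds:F2}s");
            // Expected: Wall ~2.0s, CPU ~0.00s -- the wait counts toward wall time only.
        }
    }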
The application is running on an 8-core virtual machine (VPS hosting). I don't know what physical processors are used there.
Comments

Did they finish before the profiler finished, perhaps?