c# - Is System.Diagnostics.Stopwatch inaccurate?


I use System.Diagnostics.Stopwatch to measure how long the execution of some code takes. The documentation says it should be quite exact, but I believed the measured time could differ from the true time by the time used to start and stop the Stopwatch itself.

I estimated how big that error is with the following code:

var sw = new Stopwatch();
sw.Start();
sw.Stop();
Console.WriteLine(sw.Elapsed);

It measured 00:00:00.0000090. Wow, the error is under 10 µs.

Now, after some years, I got curious about Stopwatch again. Is it exact? Does it measure wall-clock time, or is there another definition of what it measures? What if the process is blocked by an I/O operation? Is it still exact?

I did a small benchmark again:

var sw = new Stopwatch();
var startTime = System.DateTime.Now;
sw.Start();
for (int i = 0; i < 1000; i++)
{
    Console.WriteLine(i);
    Console.Error.WriteLine(i);
}
var endTime = System.DateTime.Now;
sw.Stop();
Console.Error.WriteLine(
    @"The application was running {0}, the Stopwatch measured {1}.",
    endTime - startTime,
    sw.Elapsed
);

When I started the application normally, the output was:

The application was running 00:00:01.5740900, the Stopwatch measured 00:00:01.5732109.

The difference is under 1 ms.

When I pipe the standard output of the application to a process that waits some time before reading its input, I get the following result:

The application was running 00:04:51.2076561, the Stopwatch measured 00:04:51.2012677.

The difference is more than 6 ms.

What is happening here? Is Stopwatch inaccurate, or do I misinterpret what it actually measures?

Edit: I know that System.DateTime.Now has a lower resolution, so I would expect a difference in the millisecond range.
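The resolution of DateTime.Now can be observed directly by spinning until its value changes (a minimal sketch; on many Windows systems the system clock advances in steps of roughly 10-16 ms, so a difference of a few ms between the two measurements is plausible):

```csharp
using System;

class DateTimeResolution
{
    static void Main()
    {
        // Spin until DateTime.Now changes, and record the step size.
        var start = DateTime.Now;
        var next = start;
        while (next == start)
            next = DateTime.Now;

        Console.WriteLine("Clock stepped by {0:F4} ms",
            (next - start).TotalMilliseconds);
    }
}
```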

I think you don't need to rely on a single measurement. You need to run the check several times (maybe 20 to 50 times) and take the average.
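The averaging idea could look like this (a minimal sketch; the run count of 20 and the placeholder for the code under test are illustrative choices, not anything from the question):

```csharp
using System;
using System.Diagnostics;

class AverageTiming
{
    static void Main()
    {
        const int runs = 20; // run the check several times, e.g. 20-50
        double totalMs = 0;

        for (int run = 0; run < runs; run++)
        {
            var sw = Stopwatch.StartNew();
            // ... code under test goes here ...
            sw.Stop();
            totalMs += sw.Elapsed.TotalMilliseconds;
        }

        Console.WriteLine("Average: {0:F4} ms", totalMs / runs);
    }
}
```

Averaging smooths out one-off disturbances such as the process being scheduled out or blocked on I/O during a single run.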

Also note that Stopwatch is a shell over the QueryPerformanceCounter functionality.
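You can check at run time whether Stopwatch is actually backed by the high-resolution counter, and what its tick frequency is, via the documented Stopwatch.IsHighResolution and Stopwatch.Frequency fields (a minimal sketch):

```csharp
using System;
using System.Diagnostics;

class StopwatchInfo
{
    static void Main()
    {
        // True when Stopwatch uses the high-resolution performance
        // counter (QueryPerformanceCounter on Windows) under the hood.
        Console.WriteLine("IsHighResolution: {0}", Stopwatch.IsHighResolution);

        // Ticks per second of the underlying timer.
        Console.WriteLine("Frequency: {0} ticks/s", Stopwatch.Frequency);

        // Approximate resolution in nanoseconds per tick.
        Console.WriteLine("Resolution: {0:F1} ns", 1e9 / Stopwatch.Frequency);
    }
}
```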

You can, for example, check this article: Performance Tests: Precise Run Time Measurements with System.Diagnostics.Stopwatch

