python - time.clock(): odd result when used inside a function definition


Using the following code (Python 3.3.x, WinXP):

    ## debug function in a general/personal debug include file.
    def timer_compare(time1, time2='', note='@', time3=time.clock()):
        print('time1', time1)
        time2 = time.clock()  ## same as the function-passed time.clock() (just a little later)
        print('time2', time2)
        print('time3', time3)
        exit(321)

Caller code used in the main code file:

    time0 = time.clock()
    ## <other unrelated code.>
    timer_compare(time0, time.clock())

I get the following output:

    time1 0.0445(snip)
    time2 0.0445(snip)
    time3 0.0000043(snip) <- 4.385582001116343e-06

time3 here seems to be a way too low number. (It kind of looks like it was pulled from a freshly created timer.)

What's going on here / what am I missing?

  • I know time.time() is preferred/advised over time.clock(), and why.

Function defaults are created at definition time, not when the function is called. timer_compare is a function object, and its defaults are evaluated when that object is created and then stored as an attribute on it.

Since the function is created the moment the module is imported (or when the top-level script is first loaded by Python), the time.clock() value stored as the default is very low.
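A minimal sketch of that behavior (using time.perf_counter() instead of time.clock(), since the latter was removed in Python 3.8): the default is evaluated exactly once, at the def statement, so every later call sees the same early timestamp.

    import time
    
    t_def = time.perf_counter()  # reference point taken just before the definition
    
    def show_default(stamp=time.perf_counter()):  # evaluated ONCE, right here
        return stamp
    
    time.sleep(0.05)
    a = show_default()
    time.sleep(0.05)
    b = show_default()
    
    # Both calls return the same stamp, captured back at definition time.
    assert a == b
    assert b < time.perf_counter()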

Use a sentinel instead:

    def timer_compare(time1, time2='', note='@', time3=None):
        if time3 is None:
            time3 = time.clock()
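With the sentinel, the clock is read at call time, so each call gets a fresh timestamp. A small sketch (prints and exit() stripped, and time.perf_counter() substituted for the removed time.clock()):

    import time
    
    def timer_compare(time1, time2='', note='@', time3=None):
        # Sentinel: None means "capture the clock now", at call time.
        if time3 is None:
            time3 = time.perf_counter()
        return time3
    
    first = timer_compare(0.0)
    time.sleep(0.05)
    second = timer_compare(0.0)
    
    assert second > first  # each call captured its own, later timestamp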
