time() - How should it work?
-
Is it supposed to return the actual system tick value (in seconds), or a tick based on FUZE itself or the programs it runs? Right now I'm finding that the timer keeps running even while in the FUZE IDE (code area), so when I run a program it reports a starting value equal to how long the IDE has been up. Then, after stopping the program, the tick value resets.
Is this intentional?
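For reference, this is the simplest way I can show it (a FUZE-style sketch; I'm assuming sleep() and friends behave the way I expect):

    // very first statement after hitting run: with the behaviour above,
    // this prints how long the IDE has been open instead of roughly 0
    print( time() )
    update()
    sleep( 5 )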
-
It should start at zero from when you hit run.
I've noticed another quirk of time() earlier which was both useful and annoying:
it used to return the current uptime of the system
-
@MikeDX said in time() - How should it work?:
It should start at zero from when you hit run.
I've noticed another quirk of time() earlier which was both useful and annoying:
it used to return the current uptime of the system
Ok, so technically it's a bug: if I sit on the IDE screen for 10 seconds and then hit run, time() will initially report a value of 10 when it should be 0 (or very close to it, since it takes a fraction of a second to reach that point in execution).
It can at least be worked around for the time being by getting the value of time() from the very start, then subtracting that value from the current time for each instance.
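Something along these lines (a rough FUZE-style sketch; treat the exact built-ins like clear() and update() as assumptions on my part):

    startTime = time()   // grab the offset as the very first thing

    loop
        clear()
        // elapsed seconds since run, whatever value time() started at
        print( time() - startTime )
        update()
    repeat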
-
It is probably fair to say any use of time() should be based on an offset anyway. I've always set a variable to time() at the start of my programs (any language, any platform) and then done a tick count based on time() - otime.
At least that way, when it is fixed it won't matter (nor will whatever value it started at).
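As a rough sketch of that pattern in FUZE terms (same caveat that the exact built-ins, including int(), are assumptions):

    otime = time()   // baseline taken once, right at the top of the program

    loop
        clear()
        elapsed = time() - otime      // seconds since the program started
        ticks = int( elapsed * 60 )   // e.g. derive a 60 Hz tick count
        print( ticks )
        update()
    repeat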