Hi all,
I'm programming a graph drawing simulation and now I want to improve its running time.
As a first step, I printed the time needed for each calculation to the console. Most calls report zero time, but occasionally a call does take time, e.g. when calculating values or changing something in the input string...
The thing I don't really understand is that whenever a call does take time, it is always 15 or 16 ms, and these times come from very different method calls (like I said: sometimes calculating values, sometimes changing a string, sometimes rounding numbers...).
The method calls that need these 15 or 16 milliseconds seem to occur randomly, so my question is:
Why do the methods sometimes take less than one millisecond and sometimes 15 or 16 ms?
It is not really a problem, but I would be interested to know where these times come from, and why it is always 15 or 16 milliseconds.
It would be great if anyone could explain this to me.
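In case it matters, here is a simplified sketch of how I take the timings (I'm assuming nothing fancier than `System.currentTimeMillis()` in Java; `calculateValues` is just a stand-in for the various methods I mentioned, not my actual code):

```java
public class TimingDemo {

    // Stand-in for the real work: calculating values, changing a string,
    // rounding numbers, etc.
    static void calculateValues() {
        double x = 0;
        for (int i = 0; i < 100_000; i++) {
            x += Math.sqrt(i);
        }
        // Prevent the JIT from eliminating the loop entirely.
        if (x < 0) {
            System.out.println(x);
        }
    }

    public static void main(String[] args) {
        for (int run = 0; run < 5; run++) {
            long start = System.currentTimeMillis();
            calculateValues();
            long elapsed = System.currentTimeMillis() - start;
            // Depending on the OS clock resolution, fast calls may report 0 ms,
            // and calls that happen to cross a clock tick report a full tick.
            System.out.println("elapsed: " + elapsed + " ms");
        }
    }
}
```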

