But that method doesn't give me reasonable results. I'm not talking about inaccuracies due to the limited resolution of the system ticker, I'm talking about it giving me results somewhere between 300 and 700 ms, while the actual time taken for the calculation is > 20,000 ms (roughly checked by looking at my watch).
What is the reason for that extreme inaccuracy, and how can I get rid of it? Note that I'm not looking for anything as sophisticated as serious profiling; it doesn't have to be that precise, and I don't have tools for that at hand anyway, since I'm using VC++ Express.