
Get elapsed time in Qt

I'm looking for the equivalent in Qt to GetTickCount()

Something that will allow me to measure the time it takes for a segment of code to run as in:

uint start = GetTickCount();
// do something..
uint timeItTook = GetTickCount() - start;

any suggestions?

Note for Qt 6: you can no longer use QTime for this, since QTime::elapsed() and QTime::start() were already marked obsolete in Qt 5.

BaCaRoZzo

I think it's better to use QElapsedTimer, since that is exactly what the class exists for. It was introduced in Qt 4.7. Note that it is also immune to changes in the system clock.

Example usage:

#include <QDebug>
#include <QElapsedTimer>
...
...
QElapsedTimer timer;
timer.start();
slowOperation();  // we want to measure the time of this slowOperation()
qDebug() << timer.elapsed();

BaCaRoZzo

How about QTime? Depending on your platform it should have 1 millisecond accuracy. Code would look something like this:

QTime myTimer;
myTimer.start();
// do something..
int nMilliseconds = myTimer.elapsed();

On my WinXP virtual machine it seems to only have 10 ms accuracy - can anyone confirm/deny this? I get values of 0, 10, and 20 for an operation I'm testing.
Windows is not as accurate as a UNIX-like OS when timing.
IIRC, on Windows XP the default reported system clock resolution is 15 ms, but with some Windows-specific WinAPI calls you may still get better resolution, provided the mainboard has a 1 ms-or-better timer.
QTime doesn't give CPU time. It gives total real time, and that means you're measuring the time taken by all other processes as well. So it's not very useful for measuring execution time of code.
This is a subtle and terrible bug waiting to happen: QTime is affected by the system clock. God forbid Daylight Saving Time kicks in while your timer is running.
sivabudh

Even though the first answer was accepted, people reading the rest of the answers should consider sivabudh's suggestion.
QElapsedTimer can also be used to calculate the time in nanoseconds.
Code example:

QElapsedTimer timer;
qint64 nanoSec;
timer.start();
// something happens here
nanoSec = timer.nsecsElapsed();
// print the result (nanoSec)
// something else happening here
timer.restart();
// some other operation
nanoSec = timer.nsecsElapsed();

Again: This measures real time, not CPU time consumed by the process.
It calculates it by taking the number of processor ticks the application consumed and multiplying by the number of nanoseconds per tick. It measures the CPU time consumed by the process.
It measures the time elapsed since start(), not the time consumed by the process. It's a real-time timer. When the process gets preempted (due to multitasking), time continues to pass, and QElapsedTimer will measure that too. QElapsedTimer would be pretty much useless if it would stop measuring time when the process is preempted.
@NikosC. From the Qt blog: "Qt has a number of timers, but the most useful one for benchmarking is QElapsedTimer", and then "QElapsedTimer will use the most accurate clock available. This however also means that the actual resolution and accuracy of the timer can vary greatly between systems." It chooses the most accurate clock from these: qt-project.org/doc/qt-5/qelapsedtimer.html#ClockType-enum
Damien

Expanding on the previous answers, here is a macro that does everything for you.

#include <QDebug>
#include <QElapsedTimer>
#define CONCAT_(x,y) x##y
#define CONCAT(x,y) CONCAT_(x,y)

#define CHECKTIME(x)  \
    QElapsedTimer CONCAT(sb_, __LINE__); \
    CONCAT(sb_, __LINE__).start(); \
    x \
    qDebug() << __FUNCTION__ << ":" << __LINE__ << " Elapsed time: " <<  CONCAT(sb_, __LINE__).elapsed() << " ms.";

And then you can simply use it as:

CHECKTIME(
    // any code
    for (int i=0; i<1000; i++)
    {
       timeConsumingFunc();
    }
)

output:

onSpeedChanged : 102 Elapsed time: 2 ms.


BaCaRoZzo

If you want to use QElapsedTimer, you should consider the overhead of this class.

For example, running the following code repeatedly on my machine:

static qint64 time = 0;
static int count = 0;
QElapsedTimer et;
et.start();
time += et.nsecsElapsed();
if (++count % 10000 == 0)
    qDebug() << "timing:" << (time / count) << "ns/call";

gives me this output:

timing: 90 ns/call 
timing: 89 ns/call 
...

You should measure this for yourself and respect the overhead for your timing.


I agree. I tried QElapsedTimer, and it does seem to have some overhead associated with use of the class, but it's very minor. I measured the number-crunching code of 4 methods, 3 times with QTime and 3 times with QElapsedTimer: QElapsedTimer measured 8.046 seconds on average and QTime measured 8.016 seconds on average, a difference of 30 ms. Not significant for most purposes, but maybe it matters for absolute precision. This was Qt 5.3.1 32-bit on a Windows 7 64-bit PC with an Intel i5.
See the thread here qtcentre.org/threads/…