Hi all,
For my project I'm doing a comparison between solving ODEs in MATLAB and solving them on the GPU. Does anyone know of an accurate way to time how long the program takes to run on the GPU (the same kind of accuracy provided by MATLAB's tic/toc commands)?
I have searched the forums: clock() apparently is not that accurate, another method depended on which compiler you were using, and one person talked about using the profiler (?).
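I've also read that CUDA events (cudaEventRecord / cudaEventElapsedTime) can be used for this, since they time the work on the GPU itself rather than on the host. Below is a rough sketch of what I think that would look like; the kernel is just a placeholder for my solver, and I'm not sure how the accuracy compares to tic/toc:

#include <cstdio>
#include <cuda_runtime.h>

// Placeholder kernel standing in for one step of the ODE solver.
__global__ void solveStep(float *y, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)
        y[i] = y[i] + 0.001f * y[i];   // toy explicit Euler update
}

int main()
{
    const int n = 1 << 20;
    float *d_y;
    cudaMalloc(&d_y, n * sizeof(float));

    // Events are recorded in the GPU's stream, so the measurement isn't
    // skewed by the asynchronous kernel launch the way a host timer is.
    cudaEvent_t start, stop;
    cudaEventCreate(&start);
    cudaEventCreate(&stop);

    cudaEventRecord(start);
    solveStep<<<(n + 255) / 256, 256>>>(d_y, n);
    cudaEventRecord(stop);

    // Wait until the stop event has actually completed before reading it.
    cudaEventSynchronize(stop);

    float ms = 0.0f;
    cudaEventElapsedTime(&ms, start, stop);   // elapsed time in milliseconds
    printf("kernel time: %f ms\n", ms);

    cudaEventDestroy(start);
    cudaEventDestroy(stop);
    cudaFree(d_y);
    return 0;
}

Would something like this be considered accurate enough for a MATLAB-vs-GPU comparison, or is the profiler the better option?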