Hi everyone,
I was wondering if anyone could help -
at the minute I have a program that records how many bytes are received over a set interval (for example, 500 milliseconds). How would you calculate how many megabits have been received per second when the sampling interval can vary? So far I multiply the byte count by 8 to get bits, but how do I turn that into a good per-second estimate?
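In case it helps show what I mean, here's a rough sketch of the calculation I have so far plus my guess at the per-second scaling (the function and variable names are just placeholders, not from my actual program):

```python
def megabits_per_second(bytes_received: int, interval_ms: float) -> float:
    """Convert a byte count sampled over interval_ms milliseconds
    into a rate in megabits per second (decimal megabits, 10**6 bits)."""
    bits = bytes_received * 8            # bytes -> bits (the part I already have)
    seconds = interval_ms / 1000.0       # sampling interval in seconds
    bits_per_second = bits / seconds     # scale the sample up/down to one second
    return bits_per_second / 1_000_000   # bits per second -> megabits per second

# Example: 3,200,000 bytes received in a 500 ms sample
# 3,200,000 * 8 = 25,600,000 bits over 0.5 s -> 51,200,000 bit/s -> 51.2 Mbit/s
print(megabits_per_second(3_200_000, 500))   # 51.2
```

Is dividing by the interval length in seconds like this the right way to do it, or is there a better way to smooth the estimate when the interval keeps changing?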
Many thanks in advance :)